Transcript

Aaron Powell: Welcome to Free Thoughts, I’m Aaron Powell.

Trevor Burrus: I’m Trevor Burrus.

Aaron Powell: Joining us today is our colleague Will Duffield from the Cato Institute’s First Amendment Project. Welcome to Free Thoughts, Will.

Will Duffield: Thanks for having me.

Aaron Powell: The First Amendment, and with it a lot of our notions of how we think about freedom of speech and freedom of expression, developed in a time when most of our communication or all of [00:00:30] our communication occurred either face-to-face or via print. So, in what ways has the shift to digital communication, and to the digital world as a medium for communicating and expressing ourselves, complicated the picture, complicated the way that we think about these issues?

Will Duffield: Well, it’s lowered the barriers to entry in the marketplace of ideas tremendously. Anyone who can afford a cellphone, who can afford [00:01:00] an internet connection, or even wander into a library can begin speaking to audiences of hundreds of thousands if not millions. With that we’ve seen many of the old gatekeepers fade away, and in many cases they’ve been replaced by enablers: companies, intermediaries, or people whose business it is to get others’ speech out there, [00:01:30] to provide them with a platform. And that’s been pretty revolutionary.

Trevor Burrus: It seems that we have this extra problem, though, because we’re all relying on these third parties. We called them gatekeepers, so the New York Times was a gatekeeper. I mean, it still is, but it used to be a more important gatekeeper in terms of whether you could get into its pages. Now you can broadcast yourself, as you said, but then you’re also using someone else’s technology, like the internet ISPs and things like that, to get [00:02:00] your voice out there, which is very different than giving a speech on a platform‐

Aaron Powell: The New York Times not only was the one putting out the speech, but it was the one that owned all the mechanisms for the distribution of that speech.

Trevor Burrus: Exactly, but now it’s different.

Will Duffield: Yes. And there is certainly growing concern that many of these enablers are either voluntarily becoming, or being bullied into becoming, a new set of gatekeepers. [00:02:30] But at the same time, this is a technologically aided development; it’s not reliant upon the corporate structure of Google or Facebook. If you see significant changes in the current enabler-gatekeeper market, you’re likely to see competitors arise that give people what they got from the internet a couple years ago.

Aaron Powell: I’m going to ask a question. I don’t know if this question quite makes sense, [00:03:00] but I’ll roll with it. So, you’re distinguishing gatekeepers from enablers, but in a sense the old gatekeepers, whether that was the New York Times or Random House or NBC, were also enablers, because their job was to facilitate people getting their speech out to an audience. It was just a smaller group of people that they were facilitating. Was the switch from [00:03:30] the gatekeeper role to the enabler role purely a technological switch? In the sense that the reason that, say, Random House acted as a gatekeeper was because it simply was not economically viable for it to publish the writings of everyone.

So it had to be selective, and then it had to choose how it was going to be selective. Or was it a cultural shift, in the sense [00:04:00] of a change from elitism to populism? That in the past we thought the only people worth listening to were the ones who could make it through a gatekeeping process, whereas now we think that every person’s opinion, every person’s thought, is the kind of thing that ought to be broadcast to everyone?

Will Duffield: I think it’s much more technological, and frankly I’m not sure if we’ve made that cultural shift at all. If you look at something like [00:04:30] Amazon Books, the Kindle, within their publishing environment they’re publishing nearly anyone today. And they’re able to do that because they can publish books digitally in a way that, even if they had wanted to do that, had some kind of economic incentive to do that 20 years ago, it would have been incredibly costly for them to do so. It’s not as though throughout the past 50 years you haven’t had people pumping out homemade zines, but again, their [00:05:00] distribution routes were much more expensive if they wanted to really push it out there. So you saw a lot of that being handed out on street corners, whereas now a blog link can be shared around the world in a matter of minutes.

Trevor Burrus: I think one of the things you might be getting at, Aaron, is the anti-intellectualism or anti-elite kind of attitudes, the idea that some important person who has been given permission by the New York Times to tell you things that are important [00:05:30] is being seen with less esteem than it had been, whether it’s being against intellectualism on the right or being against elites, whether it’s Bernie or Trump. And so now it’s just: my next-door neighbor on Facebook totally said that the Russians have infiltrated the White House, and that’s as valid as George Will.

Aaron Powell: Right. So let me modify what I was getting at in light of that. I wonder if there’s almost a feedback loop here? It used to be that you only [00:06:00] heard from people with a certain level of, call it, professionalism, because they made it through the gatekeeper process, and so they were also who we as a society looked to for information, and we thought of them as somewhat authoritative. The reason that we couldn’t hear from everybody back then was because the technology simply didn’t allow it. The economics didn’t work out, the distribution [00:06:30] wasn’t fast enough, and so on.

Then the technology changed and it became possible for, say, Amazon to let anyone publish a book on their platform for free, or anyone can tweet, anyone can post on Facebook, anyone can start a blog. So the technology shifted to enable all of that vast group of content creators to create content and distribute it, and then because they could do that, the culture shifted as a result. Well, [00:07:00] now because we’ve all got a platform, all of what we have to say online is equal or equally authoritative, or the notion of gatekeepers themselves is suspect, and it shifts more to that populism thing. So they’re playing off of each other.

Will Duffield: I think that’s certainly valid. With the rise of social media in particular distinct from the rest of the Internet, you’ve gotten new metrics [00:07:30] to judge not so much expertise but impact and I think that works with this feedback loop very effectively.

Aaron Powell: So it’s no longer about experts, it’s about influencers.

Will Duffield: Yes. I mean, if you go back to the late 90s, early 2000s, you saw a big blog ecosystem. That was quite viable, but someone’s reputation in that ecosystem was somewhat word-of-mouth. There wasn’t data on it, or at least not publicly accessible data, whereas today [00:08:00] you look at someone on Twitter and you see that they have 30,000 followers, so they must matter in some sense, or to a lot of people, that means that they matter.

Trevor Burrus: Generally speaking, libertarians have had an answer, so to speak, to these questions. To everything, yes or no. These questions of the First Amendment and how it applies to, say, Google or Twitter. We’ve had some controversies, such as people trying [00:08:30] to sue Donald Trump for blocking them on Twitter on First Amendment grounds, and you came to my office and brought this up, and the first thing I said was, well, the First Amendment applies to government, Twitter is not the government, move on. That ends the inquiry.

We have also discussed Google changing the rankings of Russia Today in terms of how much it will appear in your search, and again you could say: the First Amendment applies to the government, Google’s not the government, [00:09:00] move on. But you increasingly think that’s not sufficient to address the concerns that need to be addressed, correct?

Will Duffield: Well, I think we ought to, beyond supposed rote libertarianism, have a certain liberal conception of speech: what a speech environment ought to look like, how accessible speaking should be to people. And that requires some slight shifts [00:09:30] in the analysis. Not whether you can come and bring a First Amendment claim against Google over de-ranking; no, I think that would be ludicrous. The Twitter case is a bit different, because it hinges upon whether or not Donald Trump is speaking as a government official, and therefore actually has less to do with Twitter itself within that First Amendment analysis.

Trevor Burrus: Are we going to have Congress … [00:10:00] You made me think about conversations we’ve had, especially with something like Google, where the standard libertarian response is, well, start another Google. Of course Google has to rank things, because it would be just noise otherwise. If you searched for something and you were as likely to get Bob’s blog as Wikipedia, it would be a worthless thing.

Aaron Powell: They’d have to show you all of them simultaneously.

Trevor Burrus: Exactly. So, you almost have this choice architecture situation of the nudge world, where there has [00:10:30] to be ranking somehow. But given Google’s popularity and its networking effects, we could call it a monopoly, and of course the left would love to say it’s a monopoly, it should be a public utility, and then we’ll figure this out. Is there really a realistic alternative of another Google competing against Google, or should we just regulate Google?

Will Duffield: There’s certainly a tremendous first mover advantage in that market, because you are gathering information about [00:11:00] the state of the world, what’s out there, and sorting through it, and creating machine learning algorithms in order to do so more efficiently, which of course require a great deal of data to train. Again, a tremendous first mover advantage. However, going back to the original First Amendment question and what may complicate it, I think the fact that these are international firms matters [00:11:30] tremendously. Because they aren’t just falling under US law, they’re operating under a host of legal systems, and increasingly we’re seeing states around the world trying to force their interpretation of acceptable speech on the world as a whole through various social media firms.

At the same time, even domestically, we see, say, in the [00:12:00] wake of this potential Russian meddling in our election, pressure coming both from Google’s customers and from the United States government, at the same time and in a similar direction. So sorting through whether some response of Google’s was bullied out of them, was wrested from them by the US government, or was simply them trying to satisfy their customers can be really difficult.

Aaron Powell: It seems like, and this might [00:12:30] get somewhat abstract, we might complicate the traditional libertarian story of, well, it’s a private company, they can do whatever they want, and if you don’t like it, start something to compete with them. By looking at it as: you have a space, it happens to be a digital space, but it’s arguably a more important space than a lot of the physical spaces that we’re in, as far as how we [00:13:00] communicate, how we interact, how we earn a living, how we engage with the economy, all of these necessary things that we have to do. You have a single entity that has extraordinary power within that space, and it has the power to effectively compel you, if you want to be in that space, to see certain things or not see certain things, to act in certain ways or not [00:13:30] act in certain ways, right?

I’m overstating a bit here, but if we say that, then, in a cyberspace sense, do we start looking awfully close to Max Weber’s definition of the state, which is that entity that possesses a monopoly on the legitimate use of force within a geographic area? If our geographic area is the internet, is cyberspace, and we say that Google possesses [00:14:00] either a monopoly, or at least, even if Google started doing really nefarious stuff and people were competing against it, it would still be a long time before anyone could supplant it.

Trevor Burrus: Is force here kicking them off or something?

Aaron Powell: Force, or force is … We tend to think of force as violence, like I’m going to actually do physical harm to you. But we also tend to think of force as the ability to compel you to act or behave in certain ways. Like I’m going to use force to prevent you from taking [00:14:30] this thing, right? I’m going to use force to pen you in. If I lock you up, that’s using force, but it’s not necessarily violence. So if they can dictate to a very exacting degree the kind of stuff that you can do within this space, that’s not quite the same as force, but it certainly is compulsion.

In this kind of thought experiment, does that start to push these things into being the kind of entities, within this particular space, where [00:15:00] it makes sense to think of them more in a state way than as simply private firms in a thriving marketplace? I don’t know the answer to that question, I’m just kind of thinking out loud, but I could see thinking in that direction.

Trevor Burrus: My thoughts are about the ability to enter the market, even though there are large networking effects for something like Google and they have a first mover advantage, as Will pointed out. When we talk about anarchic [00:15:30] governance, polycentric law or something, generally I would suspect that in that world, the thing you’d be living under, whether it’s like a homeowner’s association or something like that, would look a lot like a state, to the point that people would ask, isn’t this basically a state? And you’d have to say, well, no, there are some important differences here that are crucial.

Right of exit and all these things being the case. When it comes to a digital world, we can’t fall into the kind [00:16:00] of consistently wrong problem with monopoly concerns: we’re prosecuting Microsoft because everyone says nothing is ever going to replace Microsoft.

Aaron Powell: Or MySpace

Trevor Burrus: Or MySpace or any of that stuff‐

Aaron Powell: Nokia.

Trevor Burrus: Yeah, Nokia. We can’t do that. It’s just the consistently wrong problem, where it could happen very quickly when Google is doing something people don’t like, and a new company says, hey, we’re a new company and we don’t do that thing that people don’t like that Google does. [00:16:30] Now, the problem is, and this might just be the kind of thing where you have to accept the foibles of humanity, because if you’re trying to do this from a standpoint of good governance, and you’re trying to talk about things like fake news, which we’ll be getting into right now, you’re saying, well, we need to have a good government, so we need to make sure that people are properly informed, and that does not include fake news.

Well, the thing is that people kind of demand fake news. So imagine Google [00:17:00] deciding to censor a bunch of views or change its rankings, taking Alex Jones down to the bottom and taking all these, maybe ones with a decidedly right-wing bent, down to the bottom. Then right-wing Google pops up and it says, we don’t do that. We will show you the real results about the real news that they’re trying to suppress. And it’s probably going to be mostly BS.

Aaron Powell: Well, that happened with Gab, right? The right-wing one: when Reddit started kicking all the alt-righters and the [00:17:30] racists and anti-Semites off, they started their own version of Reddit.

Will Duffield: This then gets us into the question of where censorship concerns are most valid and where competitors can potentially enter the market. We’ve been discussing things happening on the content layer for the first bit of this conversation, and I see that as probably the least concerning element [00:18:00] of the internet from a censorship standpoint. Because at the end of the day, if you’re kicked off Twitter you can go over to Gab and create an account there, and yes, you’ll probably be hanging out with a bunch of other alt-right ne’er-do-wells who were also kicked off Twitter. But it’s an alternative, and you can make an alternative that fulfills any kind of social function you’d like.

However, Gab was, a couple months after its introduction, kicked off both [00:18:30] the Google Play Store and the Apple App Store. At that point, for many cellphone users, it becomes much more difficult to access, because you’ve moved one layer down, essentially. And as you move further and further down into the real infrastructure of the internet, it becomes harder and harder to come up with alternatives in order to route around that kind of censorship. So, again, [00:19:00] it’s one thing if you’re bounced off Twitter; you can’t speak on Twitter anymore, you may feel censored, but your ability to communicate with the world at large has not, I think, been tremendously undermined.

If, however, a DDoS protection firm like CloudFlare stops protecting a website that you’ve created, or even if an ISP were to stop allowing your content to flow over their pipelines, that becomes much [00:19:30] more concerning.

Trevor Burrus: We have a problem there with lack of competition as you go deeper into the infrastructure. ISPs, you might not have that many available to you. At the same time, I was thinking about GeoCities, or some of these old groups, where if they wanted to kick you off because you had a Nazi GeoCities site, they could have, and maybe they did. But if you don’t have a lot of [00:20:00] ISPs to choose between, this could become a problem, possibly.

Will Duffield: Yes. Getting all the way down to domain registrars.

Trevor Burrus: You can’t-

Will Duffield: At some point, you need to move to another internet, which doesn’t exist. Now, we aren’t anywhere near that level. ICANN seems very resistant to any kind‐

Aaron Powell: They’re the people who register domain names.

Trevor Burrus: They’re a non-state organization, correct?

Will Duffield: [00:20:30] Yes. They get funding from a number of governments but they seem, again, very independent and thankfully so.

Trevor Burrus: Fake news, I want to get back to fake news because this is something that is particularly‐

Aaron Powell: It’s in the real news right now.

Trevor Burrus: Fake news is in the real news, yes. It’s particularly concerning to me. What do you hear when someone says fake news? I guess in the context like‐

Will Duffield: Fake news has been around‐

Trevor Burrus: When a congressman says, well that’s just fake news or when Donald [00:21:00] Trump says that’s fake news. What do you hear?

Will Duffield: I hear that they disagree with something someone else has said. Fake news has been around for a really long time; arguably heresy is the original fake news, and it comes down to a disagreement about what the facts of the world are. However, we have been introduced to the internet [00:21:30] coming off the back of one of the more centralizing communication technologies we’ve had in a while: television. So, in comparison to television, it’s much easier for people to get different perspectives out there on the internet, to disagree about what the world looks like. As many of the old gatekeepers have expressed frustration with this state of things, you’re hearing this fake news complaint rolled [00:22:00] around.

Trevor Burrus: So, it’s not a problem?

Will Duffield: Well, it’s certainly a problem. Epistemic uncertainty is a problem.

Aaron Powell: Fake news, I think, gets used in different ways by different groups, and I think that some of those ways are closer to what you’re describing, Will, and some are further from it. On the one side, when Donald Trump says fake news, the claim that he’s making … Yes, he’s addressing a claim that was made in a news [00:22:30] story or somewhere else that he doesn’t like. But it’s not that he’s saying I disagree with it; he’s just saying it’s fake. The content doesn’t matter, because it’s just fake, it’s not real, it’s made up on the spot or whatever else.

When Trump’s followers say fake news, they think that the Washington Post made up the stories about Roy Moore molesting children; it’s just made up, it’s [00:23:00] not factual. I think that when most of the rest of us talk about fake news, or when Congress is talking about the fake news during the election, like the Russian propaganda, what they mean is news that even the people making it knew wasn’t true. It was basically fiction from the get-go and was not intended to inform. So the claim is [00:23:30] less about I disagree with it, less about it flies against my ideological foundations, and more simply that this is intentionally pretend stuff intended to accomplish something other than informing people.

Will Duffield: Again, it doesn’t feel like anything new. Doesn’t anyone remember the Maine?

Trevor Burrus: Yes, or the Gulf of Tonkin or something.

Will Duffield: Yeah. However, I do think that the internet, being something very new and something [00:24:00] which was originally conceived of by many as a store of all of humanity’s information, is therefore socially uniquely vulnerable to the spreading of fake news. A lot of people seem fairly likely to believe what they read online. But at the same time, we have to remember that we’re talking about politics here, and it seems as though fake news about politics spreads much [00:24:30] more rapidly, is much more virulent, than fake news about, say, restaurant sanitation conditions or the usefulness of a given product.

Because, like with all political opinions, there’s no personal cost for being wrong online about something, and in fact in many cases it feels pretty good to hold on to a nice-sounding belief that might not be correct.

Aaron Powell: [00:25:00] Is this new to politics? Because as you’re describing this, it sounds like the fake news that we talk about now is very much political, because of the election and because of the scope, or lack of scope, of Russian meddling. But what you’re describing seems to fit quite well with, say, the fake news that was prevalent before that, which focused much more heavily on science and nutrition and health. Before we were griping [00:25:30] about fake news from Russian bots, what we would gripe about is people passing around these insane studies: autism and vaccines, or there’s a cancer cure, like this essential oil cures cancer.

People would share these things, and you felt good about it, and there was little cost, unless you actually tried to use essential oils to cure your cancer. But that didn’t seem to upset us as much. [00:26:00] We didn’t have congressional inquiries into the spread of this stuff. We weren’t hauling Facebook in front of Congress because, what’s her name, Science Girl, is that the‐

Trevor Burrus: GMO, Science Girl, whatever.

Aaron Powell: Because she had millions of followers and was spreading misinformation‐

Trevor Burrus: Complete misinformation.

Aaron Powell: So it does seem like this only really mattered when it hit the political world.

Will Duffield: I think [00:26:30] the reason for that is because people vote. At the end of the day, your belief about essential oils doesn’t really affect anyone beyond you. You can try to share that incorrect belief but you can’t use it to push a policy on anyone else.

Trevor Burrus: [crosstalk 00:26:52] lies or ignorance.

Will Duffield: Yeah, when you’re moving [crosstalk 00:26:53] might be a counter to that but‐

Trevor Burrus: Vaccines would be …

Will Duffield: But again, that only seemed to become troublesome when [00:27:00] it entered the realm of politics, when you saw people in their local school board meetings trying to push to have their unvaccinated children allowed in school. That was where that seemed to take off‐

Aaron Powell: And then instances of outbreaks of diseases in spots. So, on the political fake news, there are a lot of calls to do something about it, both calls for legislative [00:27:30] or regulatory change and calls for technological change: the platforms, the social media sites, should do something about it. Before we get into some of the specifics of things that have been proposed, do you think it’s even the kind of problem that we can do something about? I mean, to combat fake news, first you have to know what it is, [00:28:00] and you have to be able to say this is fake or this isn’t, or this is the kind of fake that we should do something about and this isn’t the kind of fake we should do something about.

Will Duffield: Well, and it isn’t even a “we.” It’s a “someone”: you’re empowering someone to make these kinds of distinctions, and presumably someone without the same sort of biases that hinder the rest of us. Which seems like a tall order.

Trevor Burrus: He’s had 10 years of intensive fake news training and he’s going to become [00:28:30] the minister of fake news.

Will Duffield: It’s fine then.

Trevor Burrus: Okay, absolutely. I think that Aaron’s question is interesting, but you kind of defined it beforehand. I mean, if it’s like what PolitiFact calls pants on fire: it’s intentionally, knowingly misleading‐

Will Duffield: Even the fact that it’s intentionally misleading doesn’t necessarily bring it to the level of being [00:29:00] harmful.

Trevor Burrus: Sure.

Will Duffield: Interestingly, last week Snopes debunked a Duffel Blog story, and Duffel Blog is a sort of Onion competitor for members of the military. It spreads humorously false stories about life in the DoD. Now, on one hand it’s fake news, but you don’t expect anyone to actually believe and act on it, and therefore the idea of debunking a deliberately, humorously false story [00:29:30] seems wrong.

Aaron Powell: So, that’s one of the avenues in which this gets complicated, because of satire: what’s the difference between satire and fake news? But then also, my definition of fake news, with its dependence on intent, gets us almost into a mens rea sort of problem. I mean, Facebook knows a lot about us, and maybe it can figure this out, because maybe it’s incredibly more creepy than I already think it is, but it probably can’t divine the intent behind [00:30:00] posting or creating. It can only look at the content that is posted or created, and so that bright line might be an impossible one to recognize.

Trevor Burrus: Yeah, I think we also have to discuss the non-government solutions, because there has been some stuff proposed that we can talk about in a second, but we have to discuss the non-government solutions to these problems, as I always point out with the internet. It only just came up because, especially those on the left, [00:30:30] they’re freaking out about Donald Trump’s election and they’re trying to come up with an explanation of it that they can maybe stop, and so fake news gives them an opportunity to say it was only because of Russian meddling, to say, “We’re right, it’s just someone got in the way with their lies, and so we just have to fix that and then we’ll win subsequent elections.” Which of course is terrifying, but I think that’s what’s behind it.

On the internet in general, if you think about the history of your life on the internet, it’s been a lot of having to [00:31:00] tune your BS detector. You could have seen a clickbait ad in 2004 that would say, you won’t believe what happened here, or a different kind [crosstalk 00:31:13], this one weird trick, or Nigerian scams, or different types of ways of getting at you. They’re always updating their scamming, and then people always have to update their ability to detect scams, and there’s a lag period. So, the Nigerian thing works, then they have to change it. Phishing emails [00:31:30] work, but then people catch on and they have to change it. It’s about being able to detect BS.

The listicles, the clickbait stuff. And I think with fake news we’re going to see a similar response: people are just going to get better at spotting it, not sharing it, and checking the story before they share it. If there’s some way to measure it, I suspect it’s already happening.

Will Duffield: I think you’re right about that, particularly when it comes to the sorts of fake news that we’re seeing now, which are usually text-based outrageous [00:32:00] stories. I don’t know how many people believed Bat Boy when they saw him on the side of the grocery store checkout aisle 20 years ago, but certainly fewer people believe those stories today. Now, to some extent we do have to think about where this is going to go, because at the moment it is mostly text-based, but you’re seeing technologies coming down the pipe to seamlessly fake both audio and video. Adobe’s Project [00:32:30] VoCo is a good example of this: with about 20 minutes of someone speaking, it can then start to create new conversations that they haven’t had.

Aaron Powell: There’s a video, we’ll try to put it in the show notes, of Obama and a couple of other politicians. They show you them on one side talking and delivering a speech, and then on the other side they have used this vocal synthesis software plus basically a lip remapping to make it look like they’re delivering [00:33:00] an entirely different speech. Watching them side by side, you can tell that the fake one is a little bit fake, because the mouth movements are just slightly off, but if you didn’t have them side by side, it would fool an awful lot of people.

Trevor Burrus: And that’s just the beginning, in ten years it will be indistinguishable.

Will Duffield: So that’s an area in which we need to be thinking about technologies which can combat that going forward, because we as human beings with human eyes aren’t so good [00:33:30] at doing that. You may end up with a kind of race between different sorts of machine learning algorithms, one trying to trick you, or trick the other algorithm, which then attempts to determine what’s fake and what’s not. But the sooner we can orient ourselves towards those sorts of threats, the better we’ll do, especially when they’re initially rolled out.

Trevor Burrus: How much concern do you have, because Facebook had announced it was going to, to [00:34:00] some degree, try and combat fake news itself, and Google, as we mentioned with Russia Today, does it concern you how Facebook would do it? Compared to the government … If we’re trying to say we need ways of combating this, including our own personal behavior, checking our own sources, using technologies that help combat it, or things like Snopes, whatever we’re going to have to do. Or the government [00:34:30] could do it, which seems to me to be very, very scary‐

Aaron Powell: Just to clarify, I don’t think it would be the government doing it; it would be Facebook coming up with its own way to do it, or Facebook doing it in the way that the government tells it to. But it would still be Facebook doing it. You could have the government pass a law that said something about‐

Will Duffield: This brings us to the recently proposed Honest Ads Act and Facebook’s response to it. Amy Klobuchar has proposed a bill that would [00:35:00] regulate online advertising in a way similar to how video advertising on television is regulated, with a couple of important differences. One is that it would cover all manner of political advertisements, including advertisements oriented towards a national legislative issue of importance, which goes beyond simple candidate advocacy. [00:35:30] The bill also includes a provision to have the FCC compile biannual reports on non‐​paid political online advertising, which isn’t really defined and could include almost anything.

Aaron Powell: Could that be like a YouTube channel where someone just rants about how much their wealth?

Will Duffield: Again, no one really knows what it means, and yet it’s in the bill. When Facebook attempts to come up with a solution to something like this, it doesn’t include those sorts of extraneous overreaches. [00:36:00] It’s very problem‐specific in how it’s created and tailored, and, unlike traditional legislation, Facebook can test and tweak its solutions in an ongoing fashion. After this bill was announced, Facebook announced its own updated advertising policy which, dodging the somewhat difficult‐to‐answer question of what counts as a national issue of legislative importance, [00:36:30] will create a database of all ads run on Facebook, tied to the pages that run them.

It’ll require pages that run ads to be linked to real people. On one hand this is a measure which accomplishes at least the spirit of what Congress is supposedly setting out to do, but it also provides general consumer benefit that you wouldn’t have seen with the congressional bill. If you’re interested in seeing [00:37:00] things as simple as what kinds of ads a sneaker company has been running for the past couple of years, or whether they’re showing you the same ads they show to other people, Facebook will now give that to you.

I, for one, am much more comfortable with platform‐driven solutions, mainly because the platforms aren’t wielding legal power over America as a whole, and they know their platforms pretty well. Much better than congresspeople do.

Aaron Powell: The obvious [00:37:30] danger here, to take this back to what Trevor said, is that a lot of the motivation behind the current crusade against fake news is Democrats who lost an election looking for an easy explanation, one that doesn’t point the finger at them. It isn’t “you ran a terrible candidate” or “people don’t like your ideas”; it’s “here’s something that can fix it.”

Trevor Burrus: No, that cannot [00:38:00] be it. It has to be fake news.

Aaron Powell: Facebook could do this, and Facebook could solve the fake news problem as far as we as users are concerned, but the next time the Democrats or the Republicans lose an election, they’re going to say it was because you didn’t do enough. You didn’t solve the fake news problem, so we’re going to solve it. Then, of course, come the legislative fixes, and still … [00:38:30] Even if Congress passes something to ban fake news, it is still likely that Democrats or Republicans will lose an election at some point in the future.

Then we’ll want to crack down even further. To some extent, isn’t the private sector, by saying “wait, we’ll do something about it” or “yes, we acknowledge the problem,” teeing up those kinds of fights in the future? Because what the private sector has done is essentially lend credence to the narrative that the politicians are clinging to: that the [00:39:00] reason they lost is because of this thing, and that this thing is actually a problem, when it might not really be a problem in the first place.

Will Duffield: I think you’re going to have that no matter what you do, because if the current platforms come down too hard, either on fake news or on opponents of certain politicians or whoever, those folks will find somewhere else to speak, and they already are to some extent. For a host of reasons, you’ve seen [00:39:30] YouTube demonetizing all kinds of content because advertisers have been frustrated, initially because their content was appearing next to terrorist videos, but this has now extended to all manner of things. Gun reviewers, for instance.

An alternative platform has sprung up just to host videos about firearms. You’re seeing it with Patreon, a [00:40:00] sort of crowdfunding website, which kicked off certain members of the alt‐​right. They’ve created a competitor called Hatreon.

Aaron Powell: And subtle.

Will Duffield: So, at some point, even if a congressional push to regulate something is rebutted by a shift in private policy, you’ll end up with a competitor which won’t make the same change and will [00:40:30] therefore, in the minds of congresspeople, merit a legislative response anyway.

Trevor Burrus: I think it’s important to point out, and this is some stuff I’m currently working on, that part of this problem is tied to the schismatic partisan nature of the country, and in particular the inability of partisans to understand why other people disagree with them. To me that is actually [00:41:00] the biggest threat to free speech overall, because a while back … I can’t remember exactly which one, but some very extremely left‐​wing member of Congress proposed taking Fox News’s license away under the FCC’s discretionary rules‐

Will Duffield: Trump wanted to do it with NBC.

Trevor Burrus: This was a very marginal view, to say that [00:41:30] Fox News is harming America and should just be classified as a political organization and regulated as one by something like the FEC. That’s the kind of thing that really should be terrifying to people: citizens out there who think that the biggest reason people disagree with them is that something is telling them lies. The left has thought that forever about places like Cato; we can be called [00:42:00] a lie manufacturer by people on the left who think we need to be shut down for the good of America. Now the right thinks that fake news is the problem and that’s why we’ve had all these issues for years, so those outlets need to be shut down for the good of America. Then you’re just going to have a possibly very dangerous situation.

Will Duffield: If you have a garbage‐in, garbage‐out model of people, treating them as news‐receiving automatons rather than human beings, then you’ll always seek [00:42:30] to alter their viewpoints by limiting their access to certain bits of information.

Trevor Burrus: For the good of America. Because it’s not just‐

Will Duffield: And even for their own good.

Trevor Burrus: Your point about voting is perfect, because it’s not just shutting down pseudoscience sites because somebody bought essential oils. It’s: if we’re going to save America through policy X, building the wall, single‐​payer health care or whatever, we need to make sure people understand how good it will be for them, and the way we do that is we [00:43:00] shut down the liars.

Will Duffield: I mean, a big strike against that worldview comes out of this most recent election and the internet, in that you saw the greatest shift towards increased political polarization amongst the elderly, who are also the least likely of any age cohort to receive the majority of their news online. So the internet, supposedly a tremendously polarizing force, didn’t seem [00:43:30] to most polarize those who spent the most time on it.

Aaron Powell: As far as censorship goes, if the government pulled a broadcast license in order to block certain types of political speech, we would see that as censorship. But do we see it as censorship if the government says, hey Google, you’d better do something, and as a result Google radically decreases traffic and attention to Russia Today? [00:44:00] Or, hey Facebook, you ought to do something about these liars in the libertarian think‐​tank world, and so Facebook turns off Cato’s access to its 300,000‐plus fans and followers on Facebook?

Will Duffield: I would say so. I think to make this example stark, you have to look outside the United States, because our government generally doesn’t [00:44:30] push companies to directly violate the First Amendment, and if it did so explicitly, that would be illegal. However, Europe knows no such constraints. In the past year and a half, really, we’ve seen a European Commission code of conduct, with the threat of regulation behind it, forced on Google, Facebook and Twitter, [00:45:00] pushing them to adopt European‐​style speech norms across their platforms globally through their terms of service.

In many cases when using these platforms, we fall under private standards, but private standards promulgated by the European Union. Germany has passed the NetzDG law, which attempts to expand Germany’s understanding of hate speech worldwide. It’s an [00:45:30] extraterritorial imposition, threatening social media companies with millions of euros in fines if they don’t take down certain offending content within 24 hours. The very concerning element is that it leads us towards automated takedowns, in which you have some algorithm determining what counts as hate speech in Germany and applying it to things you post here.

Trevor Burrus: Bill Hicks is never going be listened to again [00:46:00] in Germany then.

Will Duffield: Amber Rudd, Home Secretary of the UK, seems to keep shrinking her expected takedown window, from 12 hours to 2 hours to, hopefully, automatic takedowns. This covers things which are legal here but count as hate speech elsewhere, like Holocaust denial, and even the sharing of technical information, something that could be used to make a firearm or a bomb, for instance. [00:46:30] So it has a pretty wide impact, and as more and more countries come online, with populations such that they can’t simply be turned down and pushed away by the Googles and Facebooks of the world, we’re going to see an ever stranger mesh of competing restrictions on speech coming from Turkey, Nigeria, and others. China has mostly walled off [00:47:00] its internet, but this problem in general is only going to get worse. You’re left with either imposing everyone’s norms on everyone, everywhere, which leaves you with very little acceptable speech, or a more fragmented internet.

Aaron Powell: Or, I’ll just pop in here and suggest people go back and listen to our episode on the blockchain from a month or so ago. You end up with a technological shift [00:47:30] where all of this communication is encrypted and decentralized, not accessible to states and not controlled by central entities that can enforce takedown notices. I think that there’s a technological fix here, and I think that it would be relatively easy to switch this all over to a system where all of the issues that you’re raising right now about censorship, public‐private pressure and gatekeepers simply become [00:48:00] technologically impossible.

Will Duffield: I’m sympathetic to that view, but I think we need to take the role of intermediaries very seriously here, because there have been attempts to do that in the past. Sealand, an anti‐​aircraft platform that sits in international waters off the coast of the UK, was for a time in the mid‐​2000s used as a data haven. It was thought that, in a sort of Cryptonomicon‐esque fashion, people could store their [00:48:30] gambling servers there and they therefore wouldn’t be touchable by the authorities.

That’s all well and good, but that traffic flows through a cable to Sealand and then to France, and all of the gambling traffic can be parsed out as soon as it hits a server on French soil. So, as long as you still have intermediaries, and particularly intermediaries that exist within physical space, [00:49:00] you’re going to have censorship nodes, points at which the government can interfere and impose its will upon cyberspace.

Aaron Powell: We’ll move on because we’re … I had another question for you, but I’ll just say there that I think I’m more optimistic than you are, and I think that the concerns you just raised have already been mooted; we can already do those things without having to worry about [00:49:30] the centralized locations or pipes or identifying traffic. We started this conversation with the way that the digital world changed things, that kind of radical break from the prior way that we thought about speech and communication and interaction. So now that we’ve been in this world for, [00:50:00] call it, since the early 90s, that’s-

Trevor Burrus: You’ve been in that world since the early 90s; you are a very early adopter.

Aaron Powell: We being the world has had access to these kind of global networks‐

Trevor Burrus: I would say 2000 is the beginning of‐

Aaron Powell: Since people began moving online in significant numbers. I remember the wonderful straight‐to‐VHS instructional videos in the 90s about how to surf the web, with a person sticking in the AOL CD and then being blown [00:50:30] backwards by the awesomeness of the information superhighway. Do you think that in another 20 years, looking back, this shift will have been positive for freedom of speech and freedom of expression? Or do you think we’ll have moved backwards somehow, that the access it [00:51:00] gives to censors, the ability to censor, the ability to control, will outweigh the benefits?

Will Duffield: I think on the whole it’s still going to be very positive. I mean, today, for everything we’re concerned about here, we’re able to communicate with people all over the world, share ideas, share papers, often in a hidden, encrypted fashion. People can speak anonymously on the internet and shed [00:51:30] whatever persona they have in real life in order to speak honestly in a way that they might not be able to in other parts of their life. These are net goods.

I am concerned to some extent by the permanent digital nature of a lot of speech, the fact that data is being swept up right now that we don’t have the processing power to go through yet, or that it doesn’t make sense to go through [00:52:00] from an economic standpoint, but that can one day be parsed and understood in such a fashion that, in many cases, social pressure can be applied to people who thought they were speaking anonymously. Or who thought that what they were saying was encrypted; 20 years from now it might not be. However, again, on the whole it’s allowed [00:52:30] us to speak to far wider audiences and allowed individuals who couldn’t have spoken previously to do so, and that’s valuable.

Aaron Powell: Thanks for listening. This episode of Free Thoughts was produced by Tess Terrible and Evan Banks. To learn more, visit us at www.libertarianism.org.