President Trump and Parler Have Been Deplatformed: What are the consequences of giving users the boot?

Episode: S2 E1

Podcast published date:

 

SUMMARY KEYWORDS

misinformation, platform, people, content, tweets, twitter, deleting, data, checking, archive, vulnerability, posts, fact, amazon, important, platforming, social media, internet, user, moderation

SPEAKERS: Shawn Walker, Michael Simeone

Michael Simeone  00:00

This is Misinfo Weekly, a somewhat weekly program about misinformation in our time. Misinfo Weekly is made by the Unit for Data Science and Analytics at Arizona State University Library. 

Hi and welcome. It's Friday, January 29. And today, we are talking about deplatforming. A lot has changed. I think the last time we posted a podcast was December 22. Right before the holiday break. Good thing nothing happened.

 

Shawn Walker  00:30

Yeah, Happy New Year.

 

Michael Simeone  00:31

And what a year. So much has changed between when we were last recording and where we are now, and not just politically, but in terms of what the social media landscape looks like right now.

 

Shawn Walker  00:44

Yeah, I agree. One of the big changes is how publicly some of the social media platforms are responding to mis- and disinformation. It's much more of an active response versus focusing on specific posts or specific URLs. It's become even more of a conversation in the public consciousness than last year, don't you think?

 

Michael Simeone  01:04

Yeah, I think that's a good way of describing it. So, rather than line editing some pieces, if you go to Twitter right now and search for QAnon conspiracy theories, you can't find any. That's because there's been a systematic purging and censoring of certain kinds of content. Yeah, very different. And many users are no longer present on those platforms, or if they are present, they're certainly not active. And not for nothing, the President of the United States was banned from Twitter.

 

Shawn Walker  01:31

Now, former President.

 

Michael Simeone  01:32

Correct, yeah, that other thing changed. All of this kind of gets us to the topic for today, which is deplatforming. Because as you said, we're not just talking about blocking certain posts. We're not just talking about banning one account. We're talking about systematic deplatforming, where certain kinds of celebrities of extremist political positions end up finding that they're no longer welcome on some of the most popular social media platforms, and the kind of content that goes around, like QAnon conspiracy theories or other related stuff, is not present on the platform anymore. So when we say deplatforming, that has a very specific force to it, and it's a term being used a lot here in January 2021. But Shawn, technically, what is deplatforming?

 

Shawn Walker  02:19

So, deplatforming involves removing a specific account from a platform. And in this case, we've also seen removal of specific apps from platforms. So Google has gotten into the game, and Apple, with the removal of, say, the Parler app, for example. And then even Amazon is participating in deplatforming; it basically turned off the Amazon Web Services account, which is all the servers and data for Parler. Amazon flipped a switch and Parler disappeared from existence. That wasn't just a user; that entire platform was deplatformed off of Amazon's servers.

 

Michael Simeone  02:57

Yeah, so platform, right, as a glossary term: things like Twitter, Facebook, the Google Play Store, the Apple App Store, Amazon, all these different platforms that have various uses. Some of it is web hosting, some of it is applications and vending applications, some of it is social media, and all of these, roughly construed, are platforms. So censoring is one thing, but deplatforming means you're just off the thing entirely. Because of how much power a lot of these systems have accumulated over time, deplatforming is a really heavy hammer.

 

Shawn Walker  03:34

And the conversation has expanded. Because in the past, with deplatforming and moderation of content, we focused often on YouTube, Facebook, Twitter, this sort of holy triumvirate. And now we're talking about larger infrastructure providers. So you can think of the Google Play Store as the infrastructure for all of the applications that run on Android phones. The Apple App Store is the infrastructure that allows apps to be loaded and unloaded from phones. And then we have Amazon, which provides computing and internet infrastructure for the applications and platforms that run on it. So we now have this deplatforming conversation expanding into these infrastructure providers that have a lot of power that we often don't talk about.

 

Michael Simeone  04:17

Right, so different kinds of platforms are coming into play here, and there's a shift from content moderation to managing your membership on the platform. We saw a lot of platforms start to regulate in this way. And as you mentioned, Parler disappeared. What was the impact that you observed when Parler disappeared, and a lot of these users, content themes, and applications got variously deplatformed? What did you see?

 

Shawn Walker  04:45

There are different types of gaps that they leave whenever they disappear. So if we go back for one second, we can think of the types of content moderation that we were used to seeing from social media platforms like YouTube, Facebook, Twitter, for example, where they would delete individual posts--

Michael Simeone  05:01

Back in the early 2000s.

Shawn Walker  05:03

So a couple years ago, it feels like, right?

Michael Simeone  05:05

Right. 

Shawn Walker  05:06

Six, six years ago?

Michael Simeone  05:07

We actually have our content moderator on the board.

Shawn Walker  05:10

Yes. 

Michael Simeone  05:10

That was a different internet. 

Shawn Walker  05:11

I want to go back a little bit before Parler was deplatformed by Amazon. If we look at the way that we traditionally would think of, or, traditionally, two months ago, would think of deplatforming or content moderation, we would actually see the platforms delete individual posts, or not-very-prominent accounts would be suspended. Then we saw the platforms move into placing labels to correct or provide further context for tweets from, for example, President Trump. Then they moved to deleting tweets from President Trump's timeline. And then, right before he left office, they suspended his account. So all of President Trump's tweets are no longer accessible on Twitter. And all of those links from news stories and other sites to President Trump's tweets now resolve to error messages instead of the actual tweets.

 

Michael Simeone  06:06

So deplatforming people, or the escalation to deplatforming, just scrubs the record of people off completely. So now the record of the former President on Twitter is no longer part of Twitter. We observed a lot of similar things happen to other people who are not nearly as famous or as notable, but the evidence of their existence just isn't there anymore on these platforms.

 

Shawn Walker  06:31

They're removed from the platforms. But the other piece is this breadcrumb trail that leads to the platform from other social media services, from other users that linked to their content. So someone on Twitter might have linked to one of Trump's tweets; that breadcrumb still exists, but then the breadcrumb leads to a dead end. Same thing with news stories that embedded Trump's tweets. Think about the tens of thousands of news stories, at least, that embedded tweets from President Trump. Those are all now holes in those news stories.
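One way to see that breadcrumb rot concretely is to probe the embedded links directly. This is a minimal sketch, assuming Python with the `requests` library; the tweet URLs are hypothetical placeholders, and since some platforms return a 200 status with a rendered error page, the status code is only a rough first-pass signal, not a definitive check.

```python
# Rough first pass at spotting dead embedded-tweet links.
# Assumes the `requests` library; the URLs below are hypothetical placeholders.
import requests

embedded_links = [
    "https://twitter.com/example_user/status/1234567890",  # hypothetical
    "https://twitter.com/another_user/status/9876543210",  # hypothetical
]

for url in embedded_links:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
        # A 404 or 403 usually signals a deleted post or suspended account,
        # but some platforms return 200 with a JavaScript-rendered error page,
        # so status codes alone are only a first-pass signal.
        print(url, resp.status_code)
    except requests.RequestException as exc:
        print(url, "request failed:", exc)
```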

 

Michael Simeone  07:02

I think about deplatforming like Jenga, in the sense that the last four years, and all of the misinformation woes circulating on all these platforms, have been creating this wooden block tower, for better or for worse. And then if you start to pull the blocks out, that structure might not stand up as well. If you're of the perspective that you just want to eliminate that content because you think it's harmful, then perhaps you're okay with that. But that structure, the structure that you need to be able to understand the record, that structure's gone; that tower collapses. I don't know why I went to Jenga. Actually, I detest the game of Jenga. It's probably the most controversial opinion voiced on this podcast. But important pieces are missing as a result of the deplatforming.

 

Shawn Walker  07:46

Yes. In some ways, since we're filled with metaphors and analogies today, it's similar to if you woke up one morning and a building in the center of your town had just been removed: the driveway is still there, all the printed maps and all the other material still include pointers to the building, but the building's not there anymore. It used to be. And a more extreme version of that happened with Parler. But Parler was a bit more insular. Facebook and Twitter have a lot of public-facing links and public-facing views that come in, whereas Parler was a more insular community that was having discussions within Parler, so the effect of that from the outside is lessened. But all those users that moved from other platforms because they felt they were being censored, whether they were or not, that's another topic, but they felt they were being censored, so they migrated to Parler. And now Parler is gone. So now all those folks are scattering to a whole bunch of other services that are even more difficult to participate in or conduct research about.

 

Michael Simeone  08:49

Well, that sounds like the second big change between December and January: deplatforming happened and really put a bunch of Swiss cheese holes in the record. And I think if you're of the perspective of trying to reduce the harm of bad actors, you may cheer for the idea of people being kicked off platforms. But if you're interested in making sure that you understand or preserve the public record, then it's distressing to have all that information wiped away completely. But then you also mentioned this idea that people have migrated to different platforms and dispersed to a number of different alternatives. Another thing that's happened is a conversation, and I don't think it's a new conversation, about how much control or power platforms have, and whether they should have that much power. I don't know if we're gonna get that much into it today. But I just feel like it's important to note, when we're thinking about the delta between December and January, it's not been that much time. And we're already talking about, basically, the archaeology of all of these platforms that were, within a very small space of time, the refuge for people who were deplatformed from some places, and now those refuges were also deplatformed.

 

Shawn Walker  09:57

They're moving to other spaces. So for those of us that are interested in studying dis- and misinformation, or, for example, if we want to study the storming and occupation of the US Capitol, a lot of that took place within these platforms, within some of these deleted posts. And it's as if they just disappeared one morning, and researchers no longer have access to that either. So that chunk of the historical record was, like you said, just ripped up and is very difficult to get access to now.

 

Michael Simeone  10:24

Yeah, I mentioned this a minute ago, but the social media landscape is very different now. What you can be exposed to on Twitter is different now. What you can be exposed to in certain areas of YouTube has changed now. That's not to say that misinformation and disinformation isn't going to resurge on some of these platforms, but it is going to adapt and find new manifestations. But I think it's important, when we're thinking through this, that as researchers we certainly feel the loss of this data going away. As citizens, it's interesting to think through the ethics of just deleting information. But then there are some people out there, I'm sure, who are thinking, I just don't want the country to be on fire, and so maybe we should just get rid of all of these bad actors on the platforms for a little while. But we're not trying to represent this as an inside baseball affair, where the only thing that matters is the preservation of all this social media data so we can analyze it. We should talk about the advantages and disadvantages of deplatforming, because we started off by talking about a lot of the disadvantages. But what are some of the advantages to deplatforming?

 

Shawn Walker  11:28

A huge advantage is, at least momentarily, you stop information from coming from that source. So, not to continually use the Trump example, but to use it again: think about how much we've heard from Trump since his account was banned from Twitter. It's almost like there was this switch, a microphone in some ways, that was turned off, because the media paid a lot of attention and gave a lot of credence to President Trump's tweets. And then when they stopped, there was no other mouthpiece being used at that moment in time; there was just silence. Deplatforming of a user can temporarily stop misinformation from coming from a certain user. Eventually, over time, that misinformation finds another platform, another place, another account; it can sort of go around that. But the advantage is we can put a sudden barrier in front of it, and it takes a while for that misinformation to figure out how to go around.

 

Michael Simeone  12:23

Got it. What are some other advantages? 

 

Shawn Walker  12:25

The other advantage is we stop the circulation of existing posts. When they're removed from the platform, they no longer resolve to anything; they just show up as an error message. Then the information that was in the account can no longer be displayed, or it's more difficult to find that information, except for folks that preserved it.

 

Michael Simeone  12:41

Yeah, we just talked about how important it would be to go back and read this stuff. But it's still harmful; that's still bad information out there. Some of those tweets are linked to Reddit posts, or 8Chan posts, or whatever, and someone's going to come back to that. And how do people assemble evidence to make an argument online? Oftentimes by links that are embedded in tweets, and screenshotting them, or just having the link to the tweet. That's a citation of sorts online. And if somebody is constructing a misinforming argument, and they're using links to Twitter and YouTube and Facebook as a way to present their evidence, then leaving that stuff up means that somebody can still come to some kind of harm by engaging with it.

 

Shawn Walker  13:22

There are archives of web content and social media content, like the Internet Archive, but those aren't as immediately accessible as social media posts from accounts that are banned or deleted. It takes a lot of extra effort to dig into the archive to find those posts and then update all of those links in QAnon forums or 4Chan, or even news stories, to link back to archived content. So we have this space of time that we're given by deplatforming a user or removing content; we basically get a little breather to address the mis- and disinformation before it recreates itself.
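As a concrete illustration of that digging, the Internet Archive offers a public Wayback Machine availability lookup that returns the closest archived snapshot of a URL. Below is a minimal sketch assuming Python with the `requests` library; the target URL is a hypothetical placeholder, and the response fields are treated as illustrative rather than guaranteed.

```python
# Minimal sketch: look up an archived copy of a (hypothetical) deleted post
# via the Internet Archive's Wayback Machine availability endpoint.
import requests

target = "https://twitter.com/example_user/status/1234567890"  # hypothetical
resp = requests.get(
    "https://archive.org/wayback/available",
    params={"url": target, "timestamp": "20210106"},  # snapshot closest to Jan 6, 2021
    timeout=10,
)
snapshot = resp.json().get("archived_snapshots", {}).get("closest")
if snapshot and snapshot.get("available"):
    print("Archived copy:", snapshot["url"])
else:
    print("No archived snapshot found for", target)
```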

 

Michael Simeone  13:58

I really like your point here about time. I think it's one of the most compelling cases for deplatforming, in that one of the things we've talked about at length throughout these podcasts is the temporal advantage that lies have over fact checking, and deplatforming helps even that out a little bit. It reduces the spread of misinformation, so it can't have that network-propagated surge in spread to everybody. This is something that we saw with the second Plandemic film: when it was shut down, or when the amount of hits and shares that it could get on Facebook was severely throttled, that film reached far fewer audience members. I think that was an example, not necessarily the same approach, they weren't technically deplatformed, but it made a big difference because it delayed the onset of the misinformation. So I think that idea of varying the tempo of any source of misinformation is really interesting. And here's a riff on your point about time. It's been said by plenty of folks who know an awful lot about misinformation and misinformation studies that if you kick people off of platforms, they only go to, and the exact phrase is, darker corners of the internet. Sure, but that transfer is not instantaneous. Getting Twitter and Facebook and YouTube to all talk to each other seamlessly across a bunch of other platforms, that took a while, and an awful lot of money, to create that network. Even though there are dark corners of the internet, there's not an immediate replacement for the platforms that people were just kicked off of. And I think we saw in the last few weeks just how difficult it can be to re-platform. The second point about time is that, yeah, not only does it delay the onset of the message, but, and this is the assumption, right, it's worth slowing down their short-term communication with one another, even though it may very well be that they will go to, as you mentioned, less supervised, less transparent areas of the internet.

 

Shawn Walker  15:56

It's sort of like a Jurassic Park moment, right? Life finds a way. So in many cases, misinformation finds a way. You can see, as you said, this lack of moderation, or changes in moderation, even in the movement from Facebook and Twitter to Parler. The Stanford Internet Observatory released a study yesterday looking at Parler: it only had about 800 moderators and very rarely moderated content, versus Facebook, which has many more moderators, moderates content more often, and has a whole process for that. And then there's the move from Parler to other services like Gab, and CloutHub, and Clapper, for example; they have even fewer moderators than Parler. So we have to figure out this cost-benefit analysis. And sometimes deleting information can make it feel more juicy and more valuable, as in the case of Plandemic: when the video was missing, people kept talking about why it was missing. What are you hiding? Is it because you don't want me to see that video?

 

Michael Simeone  16:54

Yes. To wrap up the advantages section, it seems like it gives you a time boost. And I like the metaphor of just trying to douse the fire so it doesn't spread, but you haven't really fixed the problem. And that brings us to the disadvantages. One of them, that you mentioned, is that it inspires people: when part of your misinformation is about a deep state and tech syndicate conspiracy to harm ordinary people, it's not going to help the general message to deplatform people. And it's back to this idea that it's easy to think that political polarization is the result of misinformation, and therefore, if we eliminate the misinformation, we won't be polarized anymore, because it's a positive feedback loop. That's technically true. However, back to your point about misinformation finding a way, and some of our previous conversations about vulnerability, political polarization is a prerequisite to being very disinformed. And there's enough political polarization now that deplatforming is probably not going to be sufficient, long term, to make sure that people don't connect to one another online around these ideas and around these theories.

 

Shawn Walker  18:01

That's a really important point about the prerequisites to being misinformed. It might feel like, if we deplatform or regulate social media in a specific aspect, then misinformation would disappear. But many of the problems that we're discussing aren't new; these have been around ever since humans were able to communicate. Social media puts a specific twist on it, but this isn't a social media problem. Social media exacerbates it in a specific way. So we have to address other problems that don't have a technological solution.

 

Michael Simeone  18:33

Yeah, I think it can certainly prevent whatever information disorder we're all collectively experiencing from taking a certain shape, which is namely high-level political figures and political celebrities using social media as a platform to spread disinformation in broad daylight. Deplatforming seems like a short-term fix for that. But that's only one piece of the puzzle. In our conversation with Dan Gillmor not too long ago, one point that he made that I thought was really fascinating is that the way the press narrates any political figure's speech helps contribute to mis- and disinformation just as much, if not more, than other forums for this kind of stuff. So just because we deplatform a group of folks doesn't mean that we're out of the woods; our entire set of communicative and journalistic practices needs to tilt.

 

Shawn Walker  19:24

The way that we narrate this, the way that we believe that social media posts are newsworthy in certain ways, and the obsession with that. We then have to go back to basic journalistic practices, we have to go back to basic educational practices. There are big debates about information literacy and whether or not that addresses this problem. There's not one magical solution that's going to take us out of this, and even deleting content, adding labels to content, fact checking, none of these alone, and sometimes not even combined, address a problem that social media just happens to bring to the surface, or helps bring to the surface a little faster.

 

Michael Simeone  20:02

Yeah, I agree. If we think about this from a vulnerabilities perspective, having these major platforms go largely unmoderated is a systemic vulnerability to misinformation; I think there's no doubt about that. And you could argue that you're mitigating that vulnerability by deplatforming people. At the same time, you're not eliminating a great deal of your other vulnerabilities. Whatever social and political and economic climate we are currently experiencing, that doesn't go away. The practices of news and cable news, the political economy of the media, all of these things that helped contribute to some of the communication networks that got us here in the first place, those aren't going away because of deplatforming. So is it eliminating one part of a vulnerability? Yes, but I think it's important to see this clear-eyed: there are still plenty of other vulnerabilities, and just deleting people's accounts doesn't save us.

 

Shawn Walker  20:55

No, it might buy us time momentarily. But then we also have to deal with the costs involved with buying that time. It's not free to buy that time to slow something down; that has other unintended consequences too.

 

Michael Simeone  21:08

That's what this situation has done for the strengths and weaknesses. I think an additional thing that happens from deplatforming is just that you have this data that is out there that doesn't have a home anymore. Because it's not like people weren't paying attention to Twitter and archiving tweets from key individuals. It's not like people weren't paying attention to Parler and archiving material from Parler. It is a practice to surveil and archive this data. But where does it go? We have lots of orphan data being generated, or cut loose, in the last three or four weeks. You have some direct experience with this.

 

Shawn Walker  21:45

We had a similar experience after the Boston Marathon bombings, where Reddit especially was used as a forum to collect data and do internet sleuthing to determine: who was the Boston Marathon bomber? Where were they? The vast majority of those claims were wrong and actually harmed some people. We have a similar thing happening now. The storming of the Capitol was broadcast live, recorded live, and then posted in thousands of videos simultaneously. We've never had an event like this. As a result, there's this sort of hoarding that's happening, where there's a rush to collect this data and then make claims about it. That data is spread all over the internet. So, thinking about this as a researcher: how do I reassemble some of this data so we can understand what the tactics were? What were the techniques? What did participants say they were doing? What did participants say they were doing beforehand? How did they organize? Who didn't want to participate? So there's an effort to reassemble these kinds of shards that are spread throughout different places. And then eventually, people aren't going to want to keep them on their hard drives, or keep them online, or pay for that, and then they're going to disappear. So that's a really interesting phenomenon that's happening. It's pretty complicated, I think.

 

Michael Simeone  22:59

Yeah. So Parler ended up, it sounds like, becoming an ad hoc video archive of an incredibly significant political and historical event. Parler, as we know, wasn't particularly sophisticated with its privacy or data management; people were able to get lots of personal information about Parler users. You had mentioned photographs of driver's licenses on file.

 

Shawn Walker  23:22

A couple of interesting things about Parler: you had the option as a user to verify your account by uploading a copy of your driver's license. And Parler wasn't secure to begin with. Then, after the Capitol storming, the service providers that Parler used for security, identity verification, password changes, and text messaging for two-factor authentication abandoned the platform. So the services deplatformed from the platform; they uncoupled themselves. You could basically change anyone's password in the last few hours before Amazon shut off the lights. And so the service started to degrade, and as a result of that, the data became more accessible for a moment. And there were groups that just downloaded as much of that data as they could. It wasn't that they hacked into the systems; it's that they were able to get lists of posts and then archive webpages and other things. As the platform was sunsetting, in a way, all the security around the platform was sunsetting too, making it, in a weird way, more accessible.

 

Michael Simeone  24:22

This is a fascinating situation where, in a way, you're trying to mitigate some of those disadvantages of deplatforming that we were talking about: you're getting rid of all of this data, possibly evidence, and it's just ethereal. But in trying to capture and document it, there's no organization that says, alright, here's the practice and here's what we do. Collectively, we haven't thought that through as much. And I know there are groups and organizations that were trying to get this data. But at the same time, as you mentioned, just the hosting of this stuff becomes really important. The archiving of this becomes very important. It exposes some of the weaknesses we have from investing so much of our communication and so much of our information into platforms that, at the flick of a switch, can just delete everything.

 

Shawn Walker  25:08

Let's consider the case of the videos that were on Parler. So far, there are about 32 terabytes' worth, some say even 70 terabytes' worth, of videos from Parler floating around the internet right now. Some of these archives are hosted on Amazon; some are hosted on other services. If you wanted to download that entire 32-terabyte archive out of Amazon, that costs a little over $3,000, and hosting that data costs thousands of dollars per month.
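To put rough numbers behind those figures, here is a back-of-the-envelope sketch. The per-gigabyte price is an assumption based on typical published cloud list prices around 2021, not the actual bill anyone paid, so treat the output as illustrative only.

```python
# Back-of-the-envelope estimate of moving a 32 TB archive out of a cloud provider.
# The ~$0.09/GB egress price is an assumed list price circa 2021; real tiers vary.
ARCHIVE_TB = 32
GB_PER_TB = 1024
EGRESS_PER_GB = 0.09  # assumed data-transfer-out price, USD per GB

one_time_download = ARCHIVE_TB * GB_PER_TB * EGRESS_PER_GB
print(f"Downloading {ARCHIVE_TB} TB once: ~${one_time_download:,.0f}")  # ~$2,949

# Hosting the archive for others compounds the cost: every full copy served
# incurs roughly the same transfer charge again, on top of monthly storage fees,
# which is how the recurring bill climbs into the thousands of dollars per month.
```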

 

Michael Simeone  25:36

You write the code for the scraper, you get it running, you tweet about it, people love you, you post some of the material, and before you know it, you've got quite a piece of infrastructure on your hands that you're going to have to finance in some way.

 

Shawn Walker  25:51

There are only certain players in the world that have the internet capacity to transfer this amount of information. And even if you have a fast connection, it still might cost a lot. So it's really complicated, and there's only a handful of people that can do this, not the sort of average home internet user.

 

Michael Simeone  26:09

Yeah, I think that's a really important point. When you see somebody on Twitter, or on Reddit, or on GitHub, it's easy to see them as the same as everyone else. But some of the people who are doing this archiving, scraping, or other kinds of elaborate and important work, not everyone is like them; not everyone has their skills and talents and abilities. The internet has the ability, in some ways, to make people seem all alike. But it's not common to have somebody who has the infrastructure and skills to absorb all of Parler, let alone keep it hanging around long enough. It is not predictable that someone will A) be able to do all of this, B) hold on to all of it, and C) have a set of collectively shared values sufficient to make sure that this stuff is useful in the way that people would want it preserved in the first place.

 

Shawn Walker  27:01

We also don't know what's in some of these archives, because apparently Parler kept content that was moderated. So we can imagine, or posit, that the content that was moderated on a platform that doesn't often moderate was probably pretty bad. That moderated content is also floating out there, which probably includes illegal content and even more unpleasant content than some of the things people could think of. So that's another problem: where do you store this? There could be legal issues. So there's a risk to trying to preserve a platform that's been deplatformed.

 

Michael Simeone  27:34

It seems like there's been a good amount of agonizing over decisions around deplatforming. And that's one half of the equation. But the other half, the back half, is: what's the protocol? What's the procedure for when someone is deplatformed? Right now, it looks like the procedure is nothing; people just try, ad hoc, to figure out what to do. But it feels like enough people have enough interest and collective stakes that our decisions about deplatforming should also include some of the downstream consequences of deplatforming. We would all benefit from even a set of conversations about practices, about what to do when an entire platform disappears, or when very important historical or political figures are deplatformed from key areas.

 

Shawn Walker  28:20

Well, these platforms, and the infrastructure that sits under them, are transnational actors in a way. Amazon, Facebook, Google, Apple, Twitter, they're not constrained solely by one country's government. Their policies are supranational in a way, and they transcend. So in Germany, some content is restricted. For example, if you do a Google search in Germany, it doesn't display Nazi content that they've noticed. But in the US that content is accessible, because it's not against the law in the United States. And the decision of Google on what to remove from its index, the decision of Facebook and Twitter on what to remove, or of Amazon on what companies to kick off their hosting service, they have more power than some governments do now to determine what content we can see and what we can't see.

 

Michael Simeone  29:11

Yeah. One final point for me on deplatforming is to think about deplatforming in the overall picture of misinformation. We talk a lot about how people are exposed to tons of information all the time, and the pace of news, and the pace of news through social media, is so intense. The amount of news in a news cycle now is incredible compared to what it was decades ago. But one of the side effects of having a lot of that information, not all of the side effects, just one of them, is that it's easy to forget, it's easy to lose track of stuff; and that stuff still happened. And so deplatforming, as we mentioned, does mitigate some things. But it does injure certain capacities to collectively remember things that are important. Again, not to believe everything that they say, not to absorb it or expose people to it all the time, but maintaining a record of things that happened seems instrumental to making sure that we don't have some of the confusion and amnesia that is a precondition for a lot of misinformation tactics.

 

Shawn Walker  30:21

There are consequences for when it goes wrong. There are times, I think, we could argue it goes right; other times it goes wrong. Tarleton Gillespie has a great book about content moderation called Custodians of the Internet, and there's one example that he uses where Facebook banned images of women breastfeeding in an instructional breastfeeding Facebook group, because breastfeeding happens to show pictures of breasts and nipples, and all that content was removed. There was a petition, and Facebook reconsidered and realized it was not pornography; a nipple does not equal pornography. Female nipples, actually, because male nipples were never moderated. And that community wasn't there to show porn; that community was there to instruct other women on how to breastfeed and have discussions about that, which is a really important peer-to-peer community around breastfeeding. This moderation on the platforms is all a double-edged sword, because we're figuring this out as we go along.

 

Michael Simeone  31:17

What the latest episodes in deplatforming indicate to me, more than anything else, is this figuring it out as people go along, because there are plenty of downstream consequences to this. That it took this long to have deplatforming is very interesting as well. None of these social media companies are public utilities; they are private companies. When activity produces revenue, that is a vulnerability to misinformation, whether it fits our economic ideals or not.

 

Shawn Walker  31:48

Connected to profit, you can see that these content moderation decisions are also very sensitive to political power. So whether you are a Trump fan or not, some of Trump's tweets clearly violated Facebook's and Twitter's Terms of Service, and any other user on the platform would have had that content removed and eventually had their account banned. And notice that this banning activity did not take place until the end of President Trump's term. So this is also a political process.

 

Michael Simeone  32:18

"I would agree with you, but all this stuff you're saying about what the President tweeted, I don't believe you. And furthermore, it's not on Twitter anymore. So I don't have to believe you."

 

Shawn Walker  32:29

I can give you some Internet Archive links. But yeah, so that's also part of this amnesia, right? As part of the conversation, the historical record has changed by disappearing. How do we talk about this now?

 

Michael Simeone  32:40

Yeah, well, even harder to talk about.

 

Shawn Walker  32:42

If I can't point to a tweet so that we can have a discourse about what happened around a presidential proclamation on Twitter on January 2, that's harming our ability to have political discourse around anything that Trump ever tweeted. It's just mind-boggling.

 

Michael Simeone  33:02

I think you and I agree that fact checking should be part of a repertoire of strategies for, you know, mitigating disinformation. I feel like that's not a controversial perspective on its face. I feel like you and I are much less invested in fact checking. But would you draw a distinction between fact checking and making sure that there is a documented historical record?

 

Shawn Walker  33:22

I would, yes. 

 

Michael Simeone  33:23

You know, what I'm asking you next, which is say more?

 

Shawn Walker  33:26

Yes, I see a difference between the historical record and fact checking; they're not one and the same, because we can often do fact checking without the historical record. There are ways to create fact checks that preserve some of that record without it having to be live on the platform. Did I wiggle my way out of that?

 

Michael Simeone  33:42

I'm not sure if I got your point entirely, because from both perspectives it feels like facts matter. And the historical record is there to fact-check, to say this happened as a matter of fact; it's there to reinforce and ensure that we can't just deceive people about a narrative that's instrumental to our collective identity.

 

Shawn Walker  34:02

But there are ways to preserve certain chunks as part of fact checking. And also, if you trust the source of a fact check, for example, then their documentation of something that happened in history doesn't require it to continue to live on the platform. The problem is, if they are one and the same, then that basically means anything you fact check can never be deleted from a platform or disappear off the internet, and that's just not possible. So those two can't be aligned exactly.

 

Michael Simeone  34:31

Yeah, I feel like fact checking is a bet on the present, and chronicling history is a bet on the future. Another way of saying this is that I think one of the most important things about fact checking is that it produces a record of the fact check, not necessarily that it's going to disabuse somebody of an idea in real time. But that historical record means that later there can be a collective project of telling that story in a way that isn't just completely contaminated by misinformation. And so by making sure that we chronicle the past digitally, and making sure that we capture these ephemeral pieces of digital information, we're betting on our future selves to be able to arbitrate, and remember, and reconstruct that story. That's not the same as saying, alright, I showed you the facts, and so now you don't believe... that's not the case, right? It's never going to disabuse everyone of all the bad ideas, but the collective project of having a suitable history, a just history, those things can't happen without that chronicling, even if disabusing people of ideas in real time is something that we're both pretty skeptical of.

 

Shawn Walker  35:42

I think that's a great way to put it: that collective remembering is different than the act of fact checking. They're not one and the same.

 

Michael Simeone  35:47

Yeah, although related. I think we could make a case for fact checking only because you need to document the lie, and I think newspapers have been pretty meticulous about this. I am grateful for all the fact checking that goes on, and I'm not trying to say anything bad about the practice of fact checking. I just think the significance of it is much broader than, say, just convincing people that this one particular point is wrong. For the folks who do this kind of stuff all the time, this isn't news. But I think when people try to think about quick solutions for misinformation, fact checking, I think, is a long game, not a short game.

 

Shawn Walker  36:22

One, fact checking looks back on the history, and so there's necessarily a delay, which means that we need other tools besides fact checking, as we've discussed before. So they're connected, and then fact checking becomes part of the collective remembering, in a way.

 

Michael Simeone  36:41

Yeah, I agree. 

 

Shawn Walker  36:42

So it's complicated, but we can do fact checking without preserving all of the content that we don't want to circulate anymore.

 

Michael Simeone  36:49

Yeah, what a wild series of weeks. Politically and historically, there's been an earthquake; technologically, in terms of policy and practices, there's been an earthquake; and then in terms of data and ephemera, that's now just cut loose and people are scrambling to collect it. That seems like a really significant event too. And that was just a handful of weeks where all of this went down. It'll be very interesting to see, going forward, what the misinformation landscape and the misinformation networks start to do and how they transform now that some of this deplatforming has happened. And we didn't even get a chance today to talk about QAnon and what happens to all of that. In the summertime it was cool to talk about QAnon, but now I think everyone has heard plenty about QAnon for now. But it'll be interesting to see, as you mentioned, if this misinformation is going to find a way, what new ways are going to emerge as adaptive strategies.

 

Shawn Walker  37:43

So it's Jurassic Park 2, the sequel?

 

Michael Simeone  37:45

Hopefully, these adaptive strategies will be more entertaining than Jurassic Park 2

 

Shawn Walker  37:51

You're giving...you're ragging on Jurassic Park 2 but we'll see. We'll see how information finds a way next time.

 

Michael Simeone  37:56

Am I down on Jurassic Park 2? 

 

Shawn Walker  37:58

It sounds like it.

Michael Simeone  37:59

I'm rating it too low? I just don't think it was a very good sequel.

 

Shawn Walker  38:02

I was okay with it. How's that?

 

Michael Simeone  38:04

Hopefully whatever it is we observe is more intricate and stimulating than Jurassic Park 2. But that's just me. Thanks. Thanks for joining us today. Looking forward to a new year in 2021. Be safe, and be well. For questions or comments, use the email address datascience@asu.edu. And to check out more about what we're doing, try library.asu.edu/data

Series name
Misinfo Weekly