Rittenhouse on Facebook: How disinformation and white supremacy go together

Episode: S2 E10

Podcast published date:

Thu, 11/18/2021 11:31AM • 30:35

SUMMARY KEYWORDS

misinformation, facebook, content, platform, people, white supremacy, white supremacist, rittenhouse, censorship, search, recruiting, feel, groups, problem, important, meta, removing, recruitment, organizations, term

SPEAKERS

Shawn Walker, Michael Simeone

 

Michael Simeone  00:00

This is Misinfo Weekly, a somewhat weekly program about misinformation in our time. Misinfo Weekly is made by the Unit for Data Science and Analytics at Arizona State University Library. Hello, and welcome. It's Friday, November 12, 2021. And today, we're going to start by talking about your favorite platform, Shawn. We're going to start by talking about Facebook. 

 

Shawn Walker  00:22

Yay, Facebook. 

 

Michael Simeone  00:23

I mean, yay, Facebook, because I feel like it's becoming more and more generational. I think we're one generation away from people not even knowing what it is. Mostly because they'll call it Meta, but also because I feel like it's relevant to a particular generation and not another. Is that too much of an armchair commentary?

 

Shawn Walker  00:43

That's a little bit "get off my lawn." Yeah. We're a little... We're not that old yet. 

 

Michael Simeone  00:49

Do young people still use Facebook?

 

Shawn Walker  00:52

They do. But it's not as pervasive as other platforms like Snapchat, Instagram, and such. Still, it's one way that a lot of youth stay connected to older members of their family and their older friends.

 

Michael Simeone  01:06

Okay, so this is my kind of trolling way of actually teasing out: what's the relevance of Facebook anymore? Do you feel like Facebook is still something we ought to pay attention to as misinformation researchers, even if it's just for the olds? Isn't it still important?

 

Shawn Walker  01:24

I mean, I think saying it's for the olds is a little bit of hyperbole. But it's very important, because Facebook still has a huge network. Why did we join Facebook? We joined Facebook because others that we interact with have Facebook accounts, or because groups we want to be a part of have Facebook groups we want to participate in. That network effect of having that large network is why everyone joined, and it's why they're still there. Even though we collectively complain about Facebook, it still has a huge network and a huge audience.

 

Michael Simeone  01:52

Yeah, I think it's remarkable how keenly aware misinformation researchers are of how important it is to pay attention to Facebook. A lot of other folks, though, sometimes feel like they can dismiss Facebook, or even Facebook users, right? They chronically complain about Facebook, but they do not leave Facebook. So that's a really intriguing phenomenon.

 

Shawn Walker  02:13

Well, and we should be specific here. Do you mean Meta the company formerly known as Facebook, kind of like Prince? Or do you mean, Facebook, the platform?

 

Michael Simeone  02:21

Yeah, Facebook, the platform.

 

Shawn Walker  02:23

Because Meta, right, has a number of social media properties. We have Oculus, which is becoming fairly popular. We have Instagram, we have WhatsApp, we have the actual Facebook platform itself. So Facebook has a huge reach and is continuing to expand that reach over time, especially as the visual becomes more central to the information that we share.

 

Michael Simeone  02:47

Right? In fact, Meta is just one part of a constellation of different pathways for misinformation. Misinformation gets passed on WhatsApp. Misinformation, as we've talked about, gets passed on Instagram. I shudder at the thought of what Oculus misinformation content would look like, but I don't doubt that it could happen.

 

Shawn Walker  03:10

We're just starting (not we, as in you and I, but academia in general) to scratch the surface of research on gaming platforms and other spaces. There's disinformation there, not necessarily political disinformation of the type we usually talk about; there's disinformation in gaming platforms where you want to disinform another player so that you can have an advantage. But there's also recruitment: once you join guilds and groups, they can recruit you for other purposes, not just in the game. White supremacist and other hate groups can use some of these tools for recruitment.

 

Michael Simeone  03:48

Yeah, and I think you make a good point there: if it allows you to connect with another person, it can be an avenue for misinformation. We're talking about Facebook to start out. Well, here's what's interesting about Facebook to me today. Per the comments of some other folks that I had seen, I tried searching for the bigram "Kyle Rittenhouse" and I got no results. Now, other people were complaining that this is surely a sign that Facebook is interfering with free speech. Why is "Rittenhouse, Kyle" or "Kyle Rottenhouse" a popular search, as a way to evade this search block? Why is Kyle Rittenhouse blocked on Facebook right now?
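We don't know how Facebook's block is actually implemented, but a minimal sketch of an exact-match blocklist shows why the workarounds Michael describes are so easy. Everything here, the blocklist and the function, is hypothetical:

```python
# Hypothetical sketch of exact-match search blocking; not Facebook's code.
BLOCKED_TERMS = {"kyle rittenhouse"}

def is_blocked(query: str) -> bool:
    """Naive check: lowercase, trim whitespace, look for an exact match."""
    return query.strip().lower() in BLOCKED_TERMS

print(is_blocked("Kyle Rittenhouse"))   # True: the search returns nothing
print(is_blocked("Rittenhouse, Kyle"))  # False: reordering slips past the block
print(is_blocked("Kyle Rottenhouse"))   # False: so does a one-letter misspelling
```

Any filter keyed to exact strings invites exactly this kind of evasion, which is why the misspelled and reordered variants circulate.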

 

Shawn Walker  04:32

So Rittenhouse has become a very popular figure within the white supremacist community. There were recently pictures of him posted online where he has gone to bars and other establishments, taking pictures with other figures in the white supremacist community. So he's become kind of a celebrity, and a tool for recruitment, because of the trial and because of why he was at the protests.

 

Michael Simeone  05:00

So he's become a kind of celebrity, and celebrities are good for recruitment. He just so happens to be a white supremacist celebrity, and that's great for white supremacist recruitment. So if you're Facebook, why do you ban these kinds of searches?

 

Shawn Walker  05:16

There are a whole host of reasons why Facebook wants to decrease access to some of these keywords. One is the offensive content that can surround some of these figures; it violates their terms of service. Another is that Facebook has received immense amounts of pressure from regulators, from politicians, from industry organizations, and from think tanks whose analyses say that Facebook has been central to the spread of hate speech and of white supremacist organizations. So, for moral and business reasons, they're basically trying to get away from that content, not allow it to circulate, and not let those folks recruit on their platforms.

 

Michael Simeone  06:00

Yeah, that really doesn't come up when you just get the empty search page and it says, "Sorry, there were no search results." It doesn't quite communicate in the same way as, "Sorry everyone, looks like we're complicit in some things that we're not comfortable with, so we have to shut this down."

 

Shawn Walker  06:15

Right? They're not going to say, "Hey, did you mean white supremacy, question mark?" Or, "Would you like information on an organization that might help you get out of white supremacy?" That pop-up, Clippy, is not going to appear on Facebook. Besides the fact that Clippy is from Microsoft, Clippy is not going to pop up and be super helpful here, right?

 

Michael Simeone  06:33

I mean, that's... Having Clippy arrive at an opportune time to help you get out of a political radicalization spiral? That's a Metaverse reality I can get on board with. But you liked my pun?

 

Shawn Walker  06:46

Yes, yes. 

 

Michael Simeone  06:47

Yeah, that was pretty good, wasn't it? All right. Well, okay. So if we're talking about --

 

Shawn Walker  06:51

(Groans)

 

Michael Simeone  06:52

You know, look, it feels like a bit of a blunt force tool to censor all searches for Rittenhouse, right? You might imagine that not every mention of this person is going to become poison. Or I can certainly entertain an argument where someone might say, "Look, if you just pretend that this person doesn't exist on Facebook, that can do some harm." So it feels like a blunt force application of a censorship policy. Now, whether that's better off in the long run one way or the other, I won't comment on that. But it feels like they're really backed into a corner about this, you know. It's recently become pretty popular to talk about Facebook and misinformation; John Oliver did a piece maybe a month ago about the kind of repertoire of fact-checking and misinformation-combating staff that Facebook has available to it. And it just doesn't feel like it rises to the occasion of the problem. So it's no surprise to me that the wholesale blocking of specific search terms has arrived.

 

Shawn Walker  08:03

Well, first, I would push back a little bit on your term censorship. I agree that, writ large, we can use the term censorship. But oftentimes folks discuss social media platforms like Facebook with the term censorship, implying that Facebook is violating their First Amendment rights. Facebook is not a government space; you don't have a First Amendment right within the platform. The company has the ability, legally, to decide what content can and cannot be on its platform.

 

Michael Simeone  08:33

Yeah, I think that's an awesome point. And it raises the question: what vocabulary do we have to talk about this kind of inhibitor on a search? Again, you make an awesome point. My use of the term censorship is, I won't say lazy, but ergonomic. It's the term that came to mind as a convenient way to describe the problem. So if censorship is the vernacular, what's an alternative term to describe this kind of thing we've just talked about? What do we say? Did they just block it, and that's that?

 

Shawn Walker  09:11

So we've talked about this a little bit before in previous episodes; this function is called content moderation. Tarleton Gillespie, who's at Microsoft Research New England in Cambridge, Massachusetts, has written a book about this, "Custodians of the Internet," and discusses content moderation as a process and what the options are. One option is content removal. Another option would be fact-checking, where whenever you encounter the content, it comes with an information box. And another option is to blur the content, put up a warning, and require someone to click through to confirm that they really want to interact with that content. But the problem is that many of those approaches are super brittle. As we've talked about, for example, with the Plandemic videos, sometimes removing content just makes it more spicy for everyone. They're like, "Well, what doesn't someone want me to see?" And that can cause some of the content to go viral in other spaces. Or people describe the content in ways that try to get around content moderation, which can be an automated process, a human looking at the content, or users reporting it. But all those approaches really are subpar. They just don't work very well.
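As a schematic of the taxonomy Shawn lays out (removal, a fact-checking label, or a blurred click-through warning, driven by automated detection, human review, or user reports), here is a toy sketch. The signal names and the threshold are invented for illustration; no platform's real pipeline is this simple:

```python
# Toy illustration of the content moderation options discussed above.
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    REMOVE = auto()  # take the content down entirely
    WARN = auto()    # blur it behind a click-through warning
    LABEL = auto()   # attach a fact-checking information box
    ALLOW = auto()   # no intervention

@dataclass
class Post:
    text: str
    flagged_by_reviewer: bool    # human moderation signal
    flagged_by_classifier: bool  # automated moderation signal
    user_reports: int            # community reporting signal

def moderate(post: Post) -> Action:
    """Combine the three signal sources into one of the four actions."""
    if post.flagged_by_reviewer:
        return Action.REMOVE
    if post.flagged_by_classifier:
        return Action.WARN
    if post.user_reports >= 10:  # hypothetical reporting threshold
        return Action.LABEL
    return Action.ALLOW
```

Even in this toy form the brittleness shows: each branch keys on a single noisy signal, so content written to dodge the classifier sails straight through to ALLOW.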

 

Michael Simeone  10:24

Yeah. 

 

Shawn Walker  10:25

And it's a Herculean job. That's not to get Facebook the platform, or Meta the company, or Google, or Parler, or Apple, or any of those organizations out of this quandary. But it is a difficult, wicked problem.

 

Michael Simeone  10:37

Yeah. And I'm interested in my own slip here, because I have this hunch that describing something as censorship is a way of talking about fairness in code. When I used the term censorship earlier, I was kind of paraphrasing other people's opinions that I've read on Facebook and other platforms describing this kind of conduct as censorship. And I think censorship is kind of a code word for unfair removal of content, or not-above-board conduct as it pertains to the media that you consume, where censorship is considered something that is never a good thing. So it becomes a language about fairness, where you almost feel like you've been wronged because you can't get to this content, or, put a different way, like something shady is going on. And I say this because I feel like it's easy for Facebook to give people the wrong idea when you search for a term and it just pretends the term doesn't exist. As you mentioned, that form of moderation, I feel, does not really instill a ton of trust.

 

Shawn Walker  11:54

It depends upon the group, right? If you're against that content, then that would instill trust. What's your relationship with the content and the organization? Say, for example, you're African American, or Jewish, or LGBTQ: removing hate speech is probably going to endear the platform to you more, because you don't want to encounter that content, and you don't want to see it circulate. Versus if you're on the other side, if you're a white supremacist, or if you're against some of those types of individuals, then you're going to have a problem with that content disappearing. And there's always the middle ground; some folks support free speech in those specific ways. But the issue is that the way we talk about this really distorts the conversation. Instead of talking about the content, the harm that it causes, and the potential options that we have, a lot of the louder voices just start to scream, "You're violating my First Amendment rights to say anything that I want." And that's just not true. You can't say anything that you want in this space. It's a false argument, but it sucks up all the oxygen in the room.

 

Michael Simeone  13:02

Yeah, 100%. I think where I'm going with this is that I don't think Facebook necessarily has to placate people who are trying to use the platform for nefarious purposes; that's not what I mean. But Facebook is already an organization that people don't trust very much. And even being someone who doesn't want to see white supremacist agitation and recruitment on Facebook, I would still rather see a slightly more lengthy and honest explanation appear when I search for those terms, and not have them just be gone, or have it be ambiguous with "there's no data here," because I know there's data there. I think it's important to signal when the platform has made a decision versus when it has not. And this is the thing that bothers me personally, right? Not the people I saw posting on Facebook about being aggrieved or whatever; that doesn't matter right now. What I mean is that it's less useful to confuse "no data" with "we made a decision." And I realize that the message I got today might be different from a message that appeared three or four months ago, or a year ago, because you haven't been able to search for this term for a while now on Facebook. But making it ambiguous with "no data" doesn't serve anybody except Facebook, because it doesn't underscore that the platform made a moderation decision.

 

Shawn Walker  14:32

Well, let's devolve for a moment, maybe, into an extreme example. Let's move away from white supremacy and talk about child pornography, a topic we might have more agreement on, not just you and I, but society in general: child pornography is bad, it harms children, the circulation of it is bad, and we want it removed. So how does your example of the white supremacist content compare to the discussions around, say, child pornography and its removal? There, there's wide agreement. We want it gone.

 

Michael Simeone  15:08

Yes. 

 

Shawn Walker  15:08

And people aren't saying, "well, child pornography is free speech."

 

Michael Simeone  15:12

Oh, yeah, no. What I'm saying is, surely there are Facebook posts that mention Kyle Rittenhouse; it's just that the search does not return them. It's my impression that Facebook isn't removing every piece of content that mentions Kyle Rittenhouse; it just doesn't come back in a search. And so this is where I'm coming from: if the search says, "Sorry, there are no results," then it feels like there should be some kind of explanation that says, "We have actually removed all content having to do with this from our platform," or "We are disabling search so people can't access this kind of content," or, alternatively, "There's no data about this at all; no one ever uploaded any of this stuff to Facebook or even tried." That feels like an important distinction. It feels important that if someone searches for something like child pornography, you get a results screen from Facebook that says, "Actually, we made a moderation decision that this has no place on Facebook." What I'm looking for is some kind of explanation of the moderation practices as part of your user experience. That really matters to me when trying to figure out what you're getting when you agree to use a platform like this and trade your social activity for access to other people's social activity.
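The distinction Michael is asking for could be expressed as differentiated empty-result responses. A small sketch, with statuses and messages that are entirely hypothetical rather than anything Facebook actually exposes:

```python
# Hypothetical sketch: say *why* a search came back empty instead of
# showing one ambiguous "no results" page for every case.
from enum import Enum, auto

class EmptyReason(Enum):
    NO_MATCHES = auto()       # nothing matching was ever posted
    SEARCH_DISABLED = auto()  # content exists, but search for it is blocked
    CONTENT_REMOVED = auto()  # matching content was taken down under policy

MESSAGES = {
    EmptyReason.NO_MATCHES: "No results found for your search.",
    EmptyReason.SEARCH_DISABLED: (
        "We have disabled search for this topic under our content policies."
    ),
    EmptyReason.CONTENT_REMOVED: (
        "Content matching this search was removed for violating our "
        "community standards."
    ),
}

def empty_result_message(reason: EmptyReason) -> str:
    """Surface the moderation decision as part of the user experience."""
    return MESSAGES[reason]
```

The design point is simply that an empty index and a moderation decision are different facts, and the interface can say which one the user is looking at.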

 

Shawn Walker  16:34

I mean, what you're getting on all these platforms is very little transparency. Google does the same thing; they shift searches and shape them for a whole host of reasons. Some, we might argue, are very legitimate; others are business reasons, or policy or PR reasons. So Meta, on its platform Facebook, is not unique in this. Twitter, same thing. A question would be: how do we explain that, and what is the psychological effect of that explanation tagging someone? If it says no content was returned, that's one thing. But what if it says there's no content because we're concerned about white supremacy? Now the person who searched for that has been tagged by the platform in some way. The platform has communicated to them: well, you were searching for white supremacist content, weren't you?

 

Michael Simeone  17:22

Yeah. I mean, maybe? I think there are ways around this. And I certainly understand the point that you're making, and I agree that there's a fundamental lack of transparency in all of these services. That's part of what makes them usable: we don't get all the data all at once. But for me, the decision to remove this content, either from search or from the platform altogether, is one really high-profile way for Facebook to just kind of dodge scrutiny.

 

Shawn Walker  17:55

Well, I agree, and I'm not just trying to be a "well what about this?" And you know, be as--

 

Michael Simeone  17:59

 Sure

 

Shawn Walker  18:00

--my students who would be concerned like, "You're a professor, you don't answer questions, you just ask more questions." 

 

Michael Simeone  18:04

Yeah, I know you're not trotting out some kind of like bad faith, slippery slope argument. I, I know that.

 

Shawn Walker  18:10

But I think one of the things that's important is to really talk about how complex the problem is. As you mentioned, we have really blunt solutions. I joke with my students, when I'm giving lectures about this, that a lot of these content moderation tools are like removing an appendix with a chainsaw. It's just not a good tool for the job. But guess what? We were all out of scalpels today.

 

Michael Simeone  18:38

Right. 

 

Shawn Walker  18:38

And all we've had are chainsaws --

 

Michael Simeone  18:40

 Scalpel's not been invented by society yet, so. 

 

Shawn Walker  18:42

Yes, we don't have that version. Although people in AI and machine learning might argue that they do. I would argue that they really don't, and I'd love to have some conversations with them.

 

Michael Simeone  18:52

Really? 

 

Shawn Walker  18:53

Yeah, shocker, right? 

 

Michael Simeone  18:55

Yeah. 

 

Shawn Walker  18:56

This is a very interesting choice by the platform, because it's not just harmful content, right? Not allowing folks to search for Rittenhouse means you can't search for news articles about Rittenhouse. There's a whole bunch of non-white-supremacist content. There's a really complex information ecosystem surrounding this trial, surrounding Rittenhouse himself, surrounding the protests. And as you're saying, just cutting that off and saying there are no results, when we know there are a bunch of results being posted all the time, especially about Rittenhouse, is a really interesting decision. They're potentially trying too hard to play Whack-a-Mole.

 

Michael Simeone  19:39

Yeah, this feels like cauterization: we want to get as far away from this issue as possible. And as you said, news and information and videos could be important. But one thing I think is interesting, if we flip this around and think about why Facebook would do this (I don't want to conjecture too hard, but it helps me bring up a point): on the one hand, it's certainly face-saving to just remove yourself from the issue altogether. But there's a potential benefit here in removing access to mentions of Rittenhouse, especially because it's not censorship, because you're not interfering with free speech. And I think that's an important distinction that you've brought up. Another effect, and I can't say whether it's purposeful or not, is that it actually guards people against misinformation. If there's a subject that is highly associated with white supremacy, and you scrub that from your users' eyeballs, that also protects people from misinformation.

 

Shawn Walker  20:41

I mean, yes, that's true. But recruitment techniques, misinformation, all those things are very complex, as we've talked about; there are different techniques. And a lot of these techniques are not obvious to run-of-the-mill folks. If you're not looking for it, it's going to fly by and you're not going to see it in many cases. The platforms are good at removing obvious pieces of misinformation sometimes, things that are really clean-cut, where we have a lot of fact-checking and can find those things. We have filters for, like, "Hey, would you like to become a white supremacist? Click here." Things like that, with big neon signs, the platforms are good at, and we can use a lot of the basic techniques we already know about to address them. But recruitment for white supremacy and other hate groups uses really sophisticated tools; if you just shut down a search, they're going to bypass that pretty quickly.

 

Michael Simeone  21:42

Yeah, and I think this is where the rubber meets the road on why this particular quirk of Facebook is important for our conversation about misinformation. The coincidence of white supremacist recruiting with misinformation is this: we have seen, in samples of Twitter data and samples of Parler data, especially ones that tend to be politically radical, an opportunity for white supremacist organizations to recruit by painting a picture of the world, using misinformation, in which civilization is ending. And this is the Rittenhouse incident, right? The rhetoric surrounding it is that our society is crumbling, that the old way of life is falling apart. That is grist for the mill for white supremacist recruiting. So there is a connection between content pertaining to white supremacy and content pertaining to misinformation. Together, they create this kind of environment where people are emotionally outraged, they're vulnerable, they feel like a way of life is ending. That's a really great opportunity for further political radicalization. That's the subtlety behind some of these recruiting techniques that you mentioned.

 

Shawn Walker  23:03

Right. And if you look at some of the work done by groups like the Tech Transparency Project, some reporting by The Wall Street Journal, and a number of congressional hearings, Facebook specifically, though Facebook is not the only game in town, was slammed for this recruitment. Or now Meta; it was Facebook at the time of the testimony and the reports, now it's Meta. We'll eventually get the name right. The Wall Street Journal reported on internal Facebook research from 2016 that found that over 64% of the people who joined white supremacist Facebook groups did so because those groups were recommended by Facebook's own algorithms.

 

Michael Simeone  23:48

Awkward.

 

Shawn Walker  23:50

That's definitely an awkward report to have go public. And we see from reporting, like the Tech Transparency Project work I mentioned, that many of the larger white supremacist groups have a whole constellation of Facebook groups. They have main Facebook groups, and they have smaller ones. But the main groups are not heavily interconnected, because that would help you find all of them; once you find one, if they're all connected, you can basically stomp them all out. But they found these groups still exist. They still have web pages, they're still putting up memes, they're still inviting people to join their Facebook groups. That's not where their main activity happens, though. Once you come into the Facebook group, you're slowly converted, and then you drop into all of the back channels they have on other platforms, like WhatsApp and Signal and email and other things. So Facebook isn't the nexus of activity, but Facebook is a great on-ramp to joining a white supremacist group, for whatever reason. Or a nativist group that's anti-immigrant. In Arizona, we're right by the US-Mexico border, so there's a lot of rhetoric about immigrants coming in, or as they call them, quote unquote, "illegal aliens": they're coming in, they're taking our jobs, they're bringing in COVID, all these things. So white supremacist and nativist groups can piggyback on top of that to bring someone in, and then pop them into their larger network once they've recruited them using tools like Facebook.
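The discovery logic Shawn describes, where linked groups expose one another, is essentially graph connectivity: a breadth-first search from one known group surfaces everything it links to, so keeping clusters disconnected keeps them hidden. A sketch with invented group names:

```python
# Sketch of why disconnected group networks resist discovery.
from collections import deque

# Hypothetical "this group links to that group" edges.
LINKS = {
    "MainGroupA": ["SplinterA1", "SplinterA2"],
    "SplinterA1": ["MainGroupA"],
    "SplinterA2": ["MainGroupA"],
    "MainGroupB": [],  # deliberately unlinked to the A cluster
}

def reachable(start: str) -> set[str]:
    """Every group discoverable by following links from one known group."""
    seen, queue = {start}, deque([start])
    while queue:
        for neighbor in LINKS.get(queue.popleft(), []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

print(reachable("MainGroupA"))  # finds the whole A cluster
print(reachable("MainGroupB"))  # finds only itself: isolation keeps it hidden
```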

 

Michael Simeone  25:22

Yeah. And, you know, this point you make about organizations that are fundamentally conservative, that want to keep things the way they are: they lend themselves really well to environments where misinformation is circulating to the effect that everything is changing.

 

Shawn Walker  25:46

Especially in the current political environment, which is polarized. I mean, the political environment has always been somewhat polarized. But now some of the rhetoric that wasn't acceptable in public discourse is acceptable in public discourse. And that's a huge shift, so it piggybacks on top of "the world is crumbling," "it's not what it was." And that connects with a lot of folks who might be struggling. As we talk about, misinformation is useful. Your vulnerability might be connected to whatever your current life situation is, or you might feel scared or nervous about the changing environment. And then white supremacist groups, for example, can take advantage of that as an on-ramp into joining their ranks.

 

Michael Simeone  26:31

Yeah, that's right. There's a kind of empowerment that you can reclaim. And I think it's important that we don't treat all misinformation as exactly the same. There are some kinds of misinformation where the idea is to make you feel confused. There's some misinformation whose job is to make you feel smug and secure; we actually saw this in some misinformation sources we looked at that were targeted toward the political left, where the idea was to incite complacency. But on the other side, there's some misinformation whose job is to make you feel like the world is shifting irrevocably. And it's important to keep in mind that misinformation has epochs. There are things that repeat, but depending on what time in history we're in, there's going to be a different flavor of misinformation and different channels of dissemination. So especially now, white supremacy and certain conspiracy narratives circulated by misinformation cooperate very well together. And this is an important point: when you eliminate the Kyle Rittenhouse search term or content, a whole ton of misinformation also gets cut off, because these two things are baked together in many situations.

 

Shawn Walker  27:50

And that can also become a recruiting tactic, right? "What don't they want you to know?" Now you can't find it on Facebook, so you go elsewhere to find that information, and groups can take advantage of that for recruitment. We're using the example of white supremacy, but this can be used by other extremist groups. And we definitely don't mean that all conservative groups are white supremacist or an on-ramp. We're talking about extremist, heavily far-right groups, heavily libertarian groups that are extremist, that are connected to this.

 

Michael Simeone  28:24

And yeah, I think that really captures what's so tantalizing about being a misinformation researcher. On the one hand, you can tell that misinformation causes many harmful outcomes for consumers and for society. On the other hand, we also realize that it is not the only field of intervention or of change, and that misinformation is also an index of other societal things going on. Yes, we can eliminate misinformation about racism, and yes, we can stop recruiting around white supremacy. That won't stop white supremacy, and it won't stop recruiting. It's just one intervention to slow these kinds of things down and prevent their spread. The information we see online is a nice reminder that we are in dialogue with things that are not on platforms.

 

Shawn Walker  29:16

Correct. It's about the relationships you have with different groups. You could be on old-school mailing lists, get magazines; it's not a new problem. But misinformation is almost like a card catalog of a lot of social ills. Misinformation can often be connected to a lot of social problems, and it piggybacks on top of them as a tool: to make people more susceptible, to make the pitch a little bit sexier, to make it easier to recruit, or to kind of exploit vulnerabilities in folks.

 

Michael Simeone  29:51

Yeah, so just like you can't have a campfire on a dry summer day, you can't share Kyle Rittenhouse content on Facebook in 2021. Not a good idea.

 

Shawn Walker  30:02

Oh, okay. Alright.

 

Michael Simeone  30:05

And on that possibly shaky analogy, this is a good place for us to end. So thanks for joining us this week. Be thoughtful and be well. For questions or comments, use the email address datascience@asu.edu. And to check out more about what we're doing, try library.asu.edu/data.

 

Series name
Misinfo Weekly