Representatives from Facebook, Google and Twitter are testifying before Congress this week about propaganda on their networks. Misinformation on social media has been aimed at creating division and unrest, especially around race. And as Facebook's general counsel said on Tuesday, that's still happening. For instance, fake Facebook accounts created by Russian operatives encouraged violence against Black Lives Matter protesters. Another account, called Blacktivist, promoted violence and protests against police.
Karsonya Wise Whitehead is an associate professor of communication at Loyola University. Marketplace Tech host Molly Wood talked with her about how online propaganda influences our real-life interactions and how much responsibility social media companies should bear. Below is an edited transcript of their conversation.
Molly Wood: Do you think these accounts were used to exacerbate racial tensions? Do you think that's the case, and did it work?
Karsonya Wise Whitehead: I think that is the case. I mean we're at a moment in time when racial tensions have been heightened for the past year. And I think that anything you add in to stir up the pot, to keep people upset, to further divide people, it's going to be pretty successful at this moment in time.
Wood: And then what about trust in social media? One of the fake accounts cited as a big example was essentially a Black Lives Matter account that was not real, but had more followers than the actual Black Lives Matter page. How does it affect trust when you think that you're gathering in this online community with people that you want to talk to, and then you find out they're not real?
Whitehead: I think that's why this campaign was so successful. Because you saw that people were willing to believe that this was an actual source without doing the extra work. It is heavy lifting to begin to investigate these sources that you look to, to give you the news. You have to find out who is behind this website and what is their actual motive here.
Wood: Do you think that the tech companies have a responsibility to shoulder some of that heavy lifting?
Whitehead: I would like to think that the tech companies would be more responsible. I think that on the World Wide Web, those who have the ability to do oversight are always a step behind. By the time there's an image on Twitter that needs to be removed, it's already been retweeted multiple times. But I think that we as citizens, actively engaging in this medium and figuring it out as we go along, must shoulder some of the weight and the responsibility and ultimately some of the blame.
Wood: There have been hearings this week. What comes next?
Whitehead: One of the things that really struck me is — I asked my students to consider that if Russia has access, who else has access to not just our information, but to the discussions that are raging across our country? The ways in which they were involved in the last presidential election give us cause for concern, but it's been almost a year and we're only now trying to unravel what happened. The next time we have this situation, will it take this long before we figure out what happened, try to understand the impact of it, and try to put something in place to make sure it doesn't happen again? I don't know if the world of technology can move that quickly, because social media is so large and it is hard to manage when there are no clear boundaries about where it begins and where it ends.