talkRADIO with Penny Smith: Interview with Daniel Bruce, CEO, Internews in Europe

February 19, 2019
Penny Smith of talkRADIO interviews Internews CEO, Daniel Bruce, about disinformation and media literacy

Listen to Daniel Bruce's interview on SoundCloud

"The two fundamental building blocks of tackling those problems [disinformation and misinformation], in addition to dealing with the wild west of digital and social media, have to be a stronger news media environment in countries all over the world, the UK included, and media literacy – the citizenry’s ability to understand that environment." — Daniel Bruce

Written transcript:

Announcer: Outspoken debate and essential listening – the no-nonsense breakfast with Penny Smith in for Julia Hartley-Brewer on talkRADIO!


Penny Smith: Let us talk a bit more about what you heard in the news from Sandy Warr. Tech giants must no longer be left to police themselves. This is what MPs are arguing; they’re saying the era of Internet self-regulation has to come to an end. And they’re saying the code of ethics proposed by the committee would define what constitutes harmful or illegal content, and social media companies that breach the code by failing to remove such material would face financial penalties.

Now in the studio, Daniel Bruce, the CEO of Internews. It’s an international nonprofit pairing people worldwide with the trustworthy, high-quality information they need. And Daniel, good morning to you. This is something that echoes Germany’s Facebook law, where social media companies can be fined up to 50 million Euros for failing to remove hate speech within 24 hours.

Daniel Bruce: I think it’s really important with this inquiry to kind of take ourselves back to where this all began which was an inquiry into fake news after the Brexit referendum and subsequent political events in the UK. When we look at the recommendations in today’s publication, 90% of them focus perhaps predictably on the role of social media organizations, big data and electoral law. Our contention would be that actually, if you take it back to the issue of fake news, which itself has become a somewhat discredited term, it’s more palatable now to talk about disinformation and misinformation.

But the two fundamental building blocks of tackling those problems, in addition to dealing with the wild west of digital and social media, have to be a stronger news media environment in countries all over the world, the UK included, and media literacy – the citizenry’s ability to understand that environment. So I feel some of those pieces are missing and it’s not a surprise at all that essentially we’ve got a regulatory response, which is proposed today.

Smith: In a bad way, you’re saying – this is not the right way to be going?

Bruce: Well, you mention the Germany example, which I think has clearly been effective to a point and that’s cited in very positive terms in today’s report. But hate speech, which is what the German law deals with, is one part of the challenge on social media generally. The French law around fake news and elections is also mentioned in a positive light in today’s report. And again, those are tiny pieces of the puzzle in what is a massively complex kind of digital chaos that’s created itself.

Smith: Well, it is chaos, isn’t it? I mean that’s the problem. What can you trust anymore? And that’s…you know, I’ve had nieces and nephews say, “Is it true that such and such…?” And I say, “Unlikely, where did you read it?” And you find out where it came from and you say, “Well, therefore the likelihood is that it’s disinformation or mis…” What is the difference between disinformation and misinformation?

Bruce: It is a very fine line. So disinformation is that which is created purposely, to mislead. Misinformation is accidental.

Smith: Oh, right. I did not know that. Did you two know that? You must have known that, surely, Bobby [laugh].

Bobby: I feel like I have to say yes [laugh].

Smith: And this is the sort of thing they were talking about when they looked at allegations of Russian disinformation campaigns, i.e. targeted wrong information to make people do something that they would not normally have done if they had known the truth.

Bruce: Yes, exactly. It’s very interesting that after the committee published its interim report back in the summer, that 64% of the online traffic in response to that report – the IP addresses were traceable back to Russia. And I think it’s those nuances and complexities that I question if you can regulate them out fully.

Smith: That’s just what I was going to say. So how do you…how on earth would you go about trying to stop this slew of disinformation/misinformation when it’s so quick to post?

Bruce: Exactly. I think there was a very seminal study which came out from MIT last year – the Massachusetts Institute of Technology – and they looked at the whole existence of Twitter and 124,000 contentious news stories on Twitter, and they found that those fake news stories, or disinformation, spread up to twenty times faster than real news. And so that…I don’t think you can regulate…

Bobby: But they also found that it was a very small corner of Twitter. That it wasn’t actually having that much of an impact. So my real question is: You represent a bunch of media professionals, so you have an interest in protecting the existing news order rather than the social media, sharing…What is it that you want to happen instead of the regulatory response? What is it exactly?

Bruce: So, I think the first thing – can I talk about media literacy again – it’s touched upon a little bit in today’s report; the last five recommendations or so speak about digital literacy. There’s a difference between literacy to find your way around social media and broader media and information literacy that helps you understand, if you like, the established order of legacy media – broadcasting and newspapers and so on and so forth.

Both this report today and the Cairncross review into high-quality journalism in the UK that we had last week speak to the need to do more in this area. But they come out with conflicting recommendations, and there’s a kind of tacit acceptance in today’s report – well, the government said in response to the interim report, “well, it’s kind of happening through the school system so that’s OK,” attached to children’s online safety and things like that. I would contend that a very significant thing we could do is massively step up the focus on digital and media information literacy in this country.

Smith: In other words, concentrate in schools. Do education and say, “If you want to find out information, these are the trusted areas to go to and these are not…you have to look at where they come from.” Is that not right? Is that what you’re saying?

Bruce: To give people the skills to make that discernment themselves. But we have to conflate this whole issue with the fact that the economic environment for legacy media in this country is so challenged. So to kind of ignore that in today’s report I think is an oversight.

Bobby: But isn’t it difficult to ask people to have a level of sophistication that people who are casual users of media just aren’t going to have? The people that just read the odd news story. And the sites are very clever in the way they present them. So surely there needs to be someone else stepping in to signpost the way for them. Will education just be enough?

Bruce: So I think that what I would like to see in education is that we…we ran a ten-year program in Ukraine, for example, where we worked with the ministry of education, we worked in academia, and we had a significant impact, as the research suggests, on citizens’ and young people’s ability to make the kind of discernments that you’re describing. And part of that is actually recognizing that sometimes you just don’t know the answers to those questions. And it’s giving people the skills to know who they can trust and why. So when they see something that might be a bit dubious or might be misleading or is in that kind of digital wild west or might be spreading at this great speed on Twitter, they treat it with a pinch of salt. And the way that we’ve approached that is to teach it through a whole range of social disciplines from a very young age up to high school. And I don’t see the recommendations in this area going that far in that report.

Bobby: We don’t let people do that with television, where it’s regulated, or with newspapers, where there are adjudications and corrections published. So why should we treat online differently, when essentially it’s doing the same thing – providing information? Why is that an exception?

Bruce: There’s a lot of joy and value that people get from social media. TV is regulated. You’re absolutely right but YouTube – a lot of people get value from the fact that they can do a vlog, they can recommend, they can cook a recipe, they can look up a recipe.

Smith: Yeah, you can do that but it’s about whether it’s dis…it’s about whether you’re putting…you know, you can whip up.

Bobby: You gotta make sure you don’t kill the goose that lays the golden egg with crackdowns on social media. The idea that the government can decide what is hate speech and what isn’t – we’ve seen in the last few months there have been people who have been contacted because they re-tweeted an anti-transgender limerick or something. They were contacted through their employer. They have to decide if that’s hate speech or not. I’m very worried that the government is going to overreach. I think everyone should be.

Bruce: I completely agree with you. Exactly. Therein lies the problem. I think what we need to do is (?) media literacy and strengthen the environment. It’s greater understanding of regulation in those other spaces as well. I don’t think the average citizen understands the nuances between the way the press is regulated and the way broadcast is regulated, but I think they should have that information available to them.

Smith: And that’s where we’re going to leave it. Daniel Bruce, thank you very much, CEO of Internews.