YouTube and Google are teaming up with Wikipedia to dynamically brainwash YouTube video viewers with unrequested textual reeducation sessions.
SW: This has been a year of fake news and misinformation, and we have seen the importance of delivering information to our users accurately. There was a lot of stuff happening in the world a year ago. And we said, look, people are coming to our homepage, and if we are just showing them videos of gaming or music when something really significant happened in the world, and we are not showing it to them, then in many ways we’re missing this opportunity. We had this discussion internally where people said, you know, “What do those metrics look like, and are people going to watch that?” We came to the conclusion that it didn’t really matter. What mattered was that we had a responsibility to tell people what was happening in the world. So a year ago, we launched a few things. One of them was this top news shelf. So if you go to search, the information that we show at the top is from authoritative sources, and we limit that to authoritative sources. We also made it so that you can, for example, be in your home feed looking at gaming, music, or other information, and if something major happens in the world or in your region, we decide that we’re going to show it to you.
NT: What is authoritative?
SW: Being part of Google, we work with Google News. Google News has a program where different providers can apply to be part of Google News, and then we use a different set of algorithms to determine who within that we consider authoritative. And then based on that we use those news providers in our breaking news shelf, and in our home feed.
NT: And what goes into those algorithms? What are some of the factors you consider when deciding whether something is authoritative or not?
SW: We don’t release what those different factors are. But there could be lots of different things that go into it. These are usually complicated algorithms. You could look at the number of awards that they have won, like journalistic awards. You could look at the amount of traffic that they have. You could look at the number of people committed to journalistic writing. So, I’m just giving a few examples there, but we look at a number of those, and then from that determine—and it’s a pretty broad set. Our goal is to make that fair and accurate.
NT: It’s super complicated because we don’t want to over-weight established places and make it harder for a new place to come up. Facebook has started evaluating outlets based on how trustworthy they are by giving out surveys. And one of the obvious problems is that if you give out a survey and ask, “Is this trustworthy?” about an outlet people have never heard of, they won’t say yes. And that makes it harder for a startup journalistic entity. YouTube is, of course, the place where people start, so that’s tricky.
SW: It is tricky. There are many factors to consider. But the other thing we want to consider here is that if there’s something happening in the world, and there is an important news event, we want to be delivering the right set of information. And so, we felt that there was a responsibility for us to do that and to do it well. We released that a year ago. But I think what we’ve seen is that it’s not really enough. There continues to be a lot of misinformation out there.
NT: So I’ve heard.
SW: Yes, so you’ve heard. And the reality is, we’re not a news organization. We’re not there to say, “Oh, let’s fact-check this.” We don’t have people on staff who can say, “Is the house blue? Is the house green?” So really the best way for us to do that is to look at the publishers and figure out the authoritativeness or reputation of each publisher. And so that’s why we’ve started using that more. So one of the new things that we want to announce today, which will be coming in the next couple of weeks, is that when there are videos around something that’s a conspiracy—and we’re using a list of well-known internet conspiracies from Wikipedia—we will show, as a companion unit next to the video, information from Wikipedia about that event.
NT: YouTube will be sending people to text?
SW: We will be providing a companion unit of text, yes. There are many benefits of text. As much as we love video, we also want to make sure that video and text can work together.
NT: I love them both too.
SW: Yes, you must love text—as a writer. So here’s a video. Let’s see… “Five most believed Apollo landing conspiracies.” There is clear information on the internet about the Apollo landings. We can surface that as a companion unit; people can still watch the videos, but then they have access to additional information and can click off to go and see it. The idea here is that when there is something we have listed as a popular conspiracy theory, we have the ability to show this companion unit.
NT: So the way you’ll identify that something is a popular conspiracy theory is by looking at Wikipedia’s list of popular conspiracy theories? Or you have an in-house conspiracy theory team that evaluates…and how does someone in the audience apply to be on that team? Because that sounds amazing.
SW: We’re just going to be releasing this for the first time in a couple of weeks, and our goal is to start with the internet conspiracies on that list where there is a lot of active discussion on YouTube. But what I like about this unit is that it’s actually pretty extensible: you can watch a video where there’s a question about the information, and we can show alternative sources for you as a user to look at and research other areas as well.
Translation: when you watch a Voxiversity video on YouTube, Google News is going to pop up infoboxes from Wikipedia that will totally disprove the dangerous badthought to which you are foolishly subjecting yourself.
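Mechanically, there is not much to it. Strip away the responsibility-speak and what she describes amounts to a lookup table scraped from Wikipedia’s list of conspiracy theories, a keyword match against the video, and a text box bolted onto the watch page. Here is a minimal sketch of that kind of pipeline in Python; every topic name, URL, function name, and matching rule in it is my own illustrative assumption, since Google has not published how its actual system works.

```python
# Hypothetical sketch of the "companion unit" lookup described above.
# The topic keywords, URLs, and matching rule are illustrative assumptions;
# nothing here reflects Google's actual, unpublished implementation.

WIKIPEDIA_CONSPIRACY_TOPICS = {
    # Imagine these scraped from Wikipedia's "List of conspiracy theories" page.
    "apollo moon landing": "https://en.wikipedia.org/wiki/Moon_landing_conspiracy_theories",
    "flat earth": "https://en.wikipedia.org/wiki/Modern_flat_Earth_beliefs",
    "chemtrails": "https://en.wikipedia.org/wiki/Chemtrail_conspiracy_theory",
}


def find_companion_unit(video_title, video_tags):
    """Return a Wikipedia URL to pin beside the video, or None if no topic matches."""
    haystack = " ".join([video_title.lower()] + [t.lower() for t in video_tags])
    for topic, wiki_url in WIKIPEDIA_CONSPIRACY_TOPICS.items():
        # Crude keyword-overlap match; a production system would presumably
        # run classifiers over titles, transcripts, and engagement signals.
        if all(word in haystack for word in topic.split()):
            return wiki_url
    return None


if __name__ == "__main__":
    url = find_companion_unit(
        "Five most believed Apollo landing conspiracies",
        ["apollo", "moon landing", "hoax"],
    )
    print(url)  # prints the Moon landing conspiracy theories article URL
```

The whole mechanism reduces to this: whatever lands on Wikipedia’s list gets a mandatory Wikipedia chaperone, and whoever controls the list controls the “additional information.”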
Which gives me an idea….
Anyhow, as usual, the main challenge is that most conservatives would rather whine and cry about how the mean, unfair Left is being mean and unfair again than actually do anything about it. Here is yet another article decrying Wikipedia without mentioning the fact that Infogalactic already exists. Fortunately, someone in the comments rectified that failure; good work, Squidz Mackenzie. Keep in mind that if just one percent of the people who have publicly complained about Wikipedia bias simply joined the Burn Unit and edited Infogalactic three times per month, we’d already be threatening Wikipedia’s information supremacy.
Now, it will happen eventually. We are making constant progress and are gradually chipping away at it. But that progress is happening much more slowly than it could if conservatives would stop wasting so much time trying to improve the enemy.