Tracking Viral Misinformation – The New York Times

Two decades ago, Wikipedia emerged as a bizarre online project that aimed to crowdsource and document all human knowledge and history in real time. Skeptics worried that much of the site would contain unreliable information and frequently pointed out its mistakes.

But now, the online encyclopedia is often cited as a place that, on balance, helps combat false and misleading information spreading elsewhere.

Last week, the Wikimedia Foundation, the group that oversees Wikipedia, announced that Maryana Iskander, a social entrepreneur in South Africa who has worked for years in nonprofits tackling youth unemployment and women’s rights, will become its chief executive.

We spoke with her about her vision for the group and how the organization works to prevent false and misleading information on its sites and around the web.

Tell us about your direction and vision for Wikimedia, especially in such a fraught information landscape and in this polarized world.

There are some fundamentals of Wikimedia projects, including Wikipedia, which I think are important starting points. This is an online encyclopedia. It’s not trying to be anything else. It certainly isn’t trying to be a traditional social media platform by any means. It has a structure led by volunteer editors. And as you know, the Foundation has no editorial control. This is a user-led community that we support and enable.

The lessons to be learned, not only from what we are doing but from how we continue to iterate and improve, begin with this idea of radical transparency. Everything on Wikipedia is cited. Everything is debated on our talk pages. So while people may have different points of view, those debates are public and transparent, and in some cases actually allow the right kind of back and forth. I think that’s what’s needed in such a polarized society: you have to make room for the back and forth. But how do you do it in a way that is transparent and ultimately leads to a better product and better information?

And the last thing I would say is, you know, this is a community of extremely humble and honest people. As we look to the future, how do we build on what this platform can continue to offer society, providing free access to knowledge? How do we make sure we are reaching the full diversity of humanity in terms of who is invited to participate and what is written about? How are we making sure that our collective efforts reflect more of the global South, reflect more women and reflect the diversity of human knowledge, to be more reflective of reality?

What is your view on how Wikipedia fits into the wider problem of online disinformation?

Many of the main features of this platform are very different from those of traditional social media platforms. In the case of Covid misinformation, the Wikimedia Foundation partnered with the World Health Organization. A group of volunteers called WikiProject Medicine focuses on medical content and creates articles that are then monitored very carefully, because these are the types of topics where you want to be mindful of misinformation.

Another example is that the Foundation put together a task force ahead of the US elections, again, trying to be proactive. [The task force supported 56,000 volunteer editors watching and monitoring key election pages.] And the fact that there were only 33 reversions on the main US election page was an example of how to focus on key topics where misinformation poses a real risk.

Then another example that I think is really cool is a podcast called “The World According to Wikipedia”. And in one episode, a volunteer is interviewed who has actually made it her job to be one of the main watchers of the climate change pages.

We have technology that alerts these editors when changes are made to any page, so they can go and see what the changes are. There is also the ability to temporarily lock a page if there is a real risk of misinformation taking hold. No one wants to do that unless it is absolutely necessary. The climate change example is useful because the talk pages behind it are heavily debated. Our editors are saying: “Let’s have the debate. But this is a page we are watching and monitoring carefully.”

One of the major debates currently taking place on social media platforms is the issue of censorship of information. There are those who claim that liberal views are favored on these platforms and more conservative views are suppressed. As you take the helm at Wikipedia, how do you think about handling these debates, and how do you make judgment calls with this controversy playing out in the background?

For me, what has been inspiring about this organization and these communities is the set of pillars established on day one of Wikipedia’s founding. One of them is the idea of presenting information from a neutral point of view, and that neutrality requires an understanding of all sides and all points of view.

This is what I was saying before: have the debates on the talk pages, but then come to an informed, documented, verifiable kind of conclusion in the articles. I think this is a basic principle that could potentially offer something for others to learn from.

Coming from a progressive organization that fought for women’s rights, have you given much thought to how your background might be weaponized, with critics saying it could affect the calls you make about what is allowed on Wikipedia?

I will say two things. The really relevant aspects of the work I’ve done in the past are volunteer-led movements, which are probably a lot harder than others appreciate, and I’ve played a very operational role in understanding how to build systems, build culture and build processes that I think are going to be relevant to an organization and a set of communities that are trying to increase their scale and reach.

The second thing I would say is, again, I am on a learning journey, and I invite you to join me on it. I choose to operate in a world where we interact with others with an assumption of good faith and engage in a respectful and civil manner. It doesn’t mean that other people are going to do the same. But I think we have to hold on to that as an aspiration and, you know, as a way of being the change we want to see in the world as well.

When I was in college, I used to do a lot of my research on Wikipedia, and some of my professors would say, ‘You know, that’s not a valid source.’ But I still used it all the time. I wonder if you have any thoughts about that!

I think most professors now admit that they, too, sneak onto Wikipedia to look things up!

You know, we’re celebrating Wikipedia’s 20th year this year. On the one hand, here was the thing that I think people used to make fun of and say wouldn’t go anywhere. And it has now legitimately become the most referenced source in all of human history. I can tell you from my conversations with academics that the narrative about Wikipedia, about its sources and how to use it, has changed.
