Wikimedia is adding features to make editing Wikipedia more fun
Wikipedia is one of the sturdiest survivors of the old web, as well as one of the most clearly human-powered ones, thanks to a multitude of editors making changes across the globe. But after celebrating the site’s 20th birthday last year, the Wikimedia Foundation is turning to new — and more heavily automated — tools in search of its next wave of contributors. It’s adding features designed to ease users into making their own edits, including suggestions for easy first steps like cross-linking different articles. It’s doing so while trying not to weaken the bonds of its individual communities — and, the Wikimedia team hopes, possibly even making them stronger.
Wikimedia has been testing extra features for newcomers since 2019, and it’s now officially announcing them on a platform-wide level. Users who log into Wikipedia accounts will see a landing page for new editors. They’ll be assigned a mentor from a pool of more experienced site veterans who can answer questions. And via the landing page, they’ll be urged to start making small edits, sometimes suggested by a Wikimedia-trained machine learning system.
“The Wikimedia Foundation was noticing that there were problems with the retention of new editors, meaning that a lot of people would attempt to start editing but fail and not stick around,” explains lead product manager Marshall Miller. The team began a research project in 2018 to test new methods of getting people to stick around — first on relatively small wikis like the Czech and Korean-language versions of Wikipedia, then on larger ones, culminating in an English-language launch earlier this year.
Most people, according to Wikimedia’s surveys, start editing Wikipedia because they’ve got a particular task in mind, like writing a new article about something they’re interested in, contributing to an existing article, or fixing a typo. But they often don’t know how to start, and Wikipedia’s editing community can be notoriously, to use a gentle word, persnickety. There are good reasons for this: the site has become a widely trusted fact-checking resource, and a high bar for quality helps keep it that way. But it means a large portion of first edits are rejected, setting people up to feel like they’ve failed before even getting started. On big wikis, there’s an entrenched set of rules that can make participation challenging, while on smaller ones that don’t get as many visitors, there can be less of a clear incentive to participate.
“The way we’ve been thinking about these features is kind of starting from a place of saying: it is so hard to edit Wikipedia. There are so many barriers to entry. And there’s kind of two ways that we could attack that. One was to say, ‘Let’s teach people how to do it.’ And so we’ve done some of that,” says Miller. “The other way was to say, ‘Wikipedia editing is so hard. Let’s make easy ways to edit.’”
Mentorship is part of that first avenue of attack. The global Wikipedia community currently has 584 people signed up to mentor newcomers; its largest individual encyclopedia, the English-language Wikipedia, has 86. (Around 122,000 accounts have made an edit on English Wikipedia in the last month.) Mentors don’t work closely with every Wikipedian assigned to them, but users are encouraged to email them with questions, many of which are fairly simple but can benefit from one-on-one interaction with another person.
The second is to nudge newcomers toward simple edits they’re more likely to get right, and to suggest ways to participate. In addition to its standard edit tab, Wikimedia is adding guidance for suggested newcomer tasks like copy-editing and an option called “structured tasks,” which includes things like adding relevant images and links between articles. A machine learning algorithm will suggest page images and links in topics that new editors say they’re interested in, and the editors can approve or reject them, functioning as a human filter on an AI system. “These are some of the first edits that you can do with one hand on your phone — like with one thumb, you can be editing while you hold onto the rail on the bus,” says Miller.
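Wikimedia hasn’t published the feature’s code alongside these announcements, but the workflow Miller describes is a familiar human-in-the-loop pattern: a model proposes small edits, and the newcomer simply accepts or rejects each one. A rough Python sketch of that flow, with every name, value, and threshold purely hypothetical, might look like this:

```python
from dataclasses import dataclass

@dataclass
class LinkSuggestion:
    """A machine-generated proposal to link a phrase to another article."""
    article: str       # page being edited
    phrase: str        # text in the article that would become a link
    target: str        # article the link would point to
    confidence: float  # model score between 0 and 1

def review_suggestions(suggestions, ask_editor, min_confidence=0.5):
    """Show each suggestion to a newcomer and keep only the ones they approve.

    `ask_editor` is any callable returning True (accept) or False (reject),
    for example a tap on a mobile prompt. The model proposes; the human decides.
    The 0.5 threshold is purely illustrative.
    """
    approved = []
    for s in suggestions:
        if s.confidence < min_confidence:   # skip weak guesses entirely
            continue
        if ask_editor(f'Link "{s.phrase}" to "{s.target}"?'):
            approved.append(s)              # only human-approved edits get saved
    return approved

# Example: a newcomer reviewing two suggestions with one thumb.
queue = [
    LinkSuggestion("Bánh mì", "baguette", "Baguette", 0.92),
    LinkSuggestion("Bánh mì", "street food", "Street food", 0.41),
]
accepted = review_suggestions(queue, ask_editor=lambda prompt: True)
print([s.target for s in accepted])  # -> ['Baguette']
```

Nothing in the sketch reaches an article without an explicit human approval, which is the point of the pattern.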
The algorithm’s own accuracy rate isn’t exemplary: editors deem about 75 percent of the link recommendations accurate, and the number is between 65 and 80 percent for images, varying by wiki. But 90 percent of the edits that humans make with them are retained. The system isn’t available on English-language Wikipedia yet — it’s still being trialed on smaller wikis — but Wikimedia plans to eventually make it available everywhere.
Wikimedia’s new system is designed to offer lots of these interface-based rewards. An “impact” section on the newcomer page, for instance, will show people how many pageviews the articles they edited have received, giving them a sense of the difference they’re making. In tests, people who see the new features are about 16 percent more likely to make their first edit and — for people who start the process — are 16 percent more likely to come back and make another.
If you’ve used apps like Duolingo or Tinder, these little nudges might seem familiar. They’re a kind of gamification: a way to turn a daunting task into a series of small actions with symbolic awards. These systems often come in for criticism, too — described as “addictive” or manipulative.
But the Wikimedia team sees its work as structurally different. For one thing, there’s no real profit motive on Wikipedia — the goal isn’t to get people “hooked” on contributing but to get them comfortable with the process. For another, this work is being carried out in public, with the results of individual trials and proposals documented online where the global editorial community can weigh in.
Some of the resulting discussions are high-level, while others are extremely specific to individual wikis. “They’re involved in helping even designing the different algorithms for the different languages,” says lead designer Rita Ho — Vietnamese-language Wikipedia, for instance, needed its algorithm tweaked to account for how the language defines the beginnings and endings of words. An individual wiki’s administrators can also opt to turn the features off — although, so far, Ho and Miller say that’s been rare.
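To make that Vietnamese example concrete: written Vietnamese puts spaces between syllables rather than between words, so a single word such as “Hà Nội” spans two space-separated tokens. The toy snippet below is not Wikimedia’s code, just an illustration of why a naive whitespace matcher misbehaves there while a span-aware one does not:

```python
# Toy illustration (not Wikimedia's code) of why link matching needs
# per-language word segmentation. Written Vietnamese separates syllables,
# not words, with spaces, so the word "Hà Nội" spans two tokens.
def naive_tokens(text: str) -> list[str]:
    return text.split()

sentence = "Tôi sống ở Hà Nội"    # "I live in Hanoi"
link_target = "Hà Nội"            # a two-syllable word to link

print(naive_tokens(sentence))                  # ['Tôi', 'sống', 'ở', 'Hà', 'Nội']
print(link_target in naive_tokens(sentence))   # False: no single token matches
print(link_target in sentence)                 # True: a span-aware matcher finds it
```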
While these changes are largely technical, the goal is to help build up the number of people who feel comfortable connecting with other humans in Wikipedia’s community, particularly in smaller wikis that badly need new editors. Systems like structured tasks are supposed to let people dip their toes in the water — but eventually, they’ll have to jump in.
“There are community members who are concerned that the more newcomers interact with automated processes, the less they understand the fundamentals of the wiki process, the community-based process,” acknowledges Miller. “Because these communities, even though they need images and they need links, they also need their future administrators, their future people that discuss policy, the future people that write full articles from whole cloth. And so part of our design is — how can the user realize that they want to discover more and get deeper into this?”