This is a draft version. Final version is at [http://blog.wikimedia.org/2012/10/24/fix-this-broken-workflow/](http://blog.wikimedia.org/2012/10/24/fix-this-broken-workflow/).
Wikipedia is more than 10 years old and has hundreds of millions of visitors a month. Here at the Wikimedia Foundation, we’re very proud to support such a project. But despite Wikipedia being a household name, the issues with our user experience are deeply troubling.
This is especially true for the smaller contingent of people who are the regular contributors to the encyclopedia. Today, across the Wikipedias, there are around 80,000 people who make five or more edits to the site every month, and they are determined to improve it. As Wikipedia’s user interface and user experience have failed to keep pace with the encyclopedia’s growth, their community, like the proverbial Little Dutch Boy, has creatively built workarounds to accomplish their goal of creating and curating the content nearly half a billion people rely on every month.
Despite their efforts, the lack of a modernized editor experience has contributed to a decline in active editorship starting around 2007:
In particular, this has created an increasingly steep barrier to new editors as they attempt to quickly learn the many kludges that experienced ones slowly accreted to keep the encyclopedia together: the number of new editors began declining sharply as early as 2005. Without this funnel of new editors becoming experienced ones, the encyclopedia faces a slow, inexorable heat death.
This trend has been the ever-present Sword of Damocles that motivates all of us here working on editor engagement.
The Editor Engagement Experiments team tries to reverse this trend by defining, measuring, and fixing these important editing workflows and improving the experience of Wikipedia editors who create content used by people all around the world.
The problem (by example)
Imagine you want to create an article for English Wikipedia. You search for the term or you enter the URL directly. If you’re not logged in, the first hurdle is that the site will simply tell you that you don’t have permission to start the page. At this point, most people would give up. In reality, you just need to create an account; we simply don’t tell you that up front.
Let’s say you manage to log in or register and then get back to the task at hand. Great. But not so great if you’re brand new to Wikipedia, because all we do is dump a blank text box on you and hope you know what you’re doing. There’s no warning that poor articles will be swiftly deleted, or that you should get your feet wet with one of several workflows that are safer alternatives to starting a page immediately if you’re not 100 percent sure it is appropriate or complete.
Thousands of people are subjected to this experience every month, and all they’re trying to do is help improve Wikipedia. If this fact makes you a little bit angry, keep reading.
Habits & Affordances: Defining the problem
By analogy, one way of understanding the nature of the problem is to describe the workflows of active editors in terms of habits. In The Power of Habit, Charles Duhigg describes habits that power our lives, organizations, and movements as a cycle of cue, routine, and reward.
An active editor responds to a rich set of cues: red links, article stubs, cleanup templates, Special Pages such as the New Pages Feed, WikiProject attention-needed requests, watchlists, User talk notifications, and so on. They engage in routines that can vary from the simple to the complex, often assisted by gadgets like Huggle or bots on the Wikimedia Toolserver. And, finally, they receive the reward of adding a new article to the encyclopedia, fixing errors, or fighting vandals and spam. These loops become so ingrained that editors quickly move from first-time users (or even vandals) to our most active contributors, and they develop such mastery and expertise that a diminishing number of them have been able to manage an increasingly large encyclopedia.
The problem is that these cues, routines, and rewards were simpler and made sense to the user of a decade ago. To the new user, none of them resemble the affordances of the web today: people expect to edit without coding, to hit a reply button on their talk page, and they don’t even expect to be allowed to be bold and fix a universally accessible article on Wikipedia themselves.
A/B Testing: Measurement and experimentation
Returning to the original example of the new editor: the way Editor Engagement Experimentation seeks to address this is very simple. Instead of either turning away interested editors who aren’t logged in, or leaving new editors to the fates by not properly instructing them about the routines for creating a good article, we’d like to create an uncomplicated landing page system, one that gives proper cues to the editor:
- that they should log in if they aren’t already;
- that they can create an article now, but that it is subject to high standards; or
- that they can use their personal sandbox to start an article in safety.
The goal here is to support authors of new articles on Wikipedia by making clear the various methods for starting a new topic, each of which has different advantages depending on their experience level and the free time they have to devote to the project.
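As a rough illustration of the cue logic above, here is a minimal sketch in TypeScript. It is not our implementation; the types, function name, and sandbox title scheme are hypothetical placeholders.

```typescript
// Hypothetical sketch of the landing-page cues described above.
// None of these names correspond to real MediaWiki APIs; they are
// placeholders for illustration only.

type Cue =
  | { kind: 'login' }                          // prompt to log in or register first
  | { kind: 'createNow'; warning: string }     // create immediately, with a quality warning
  | { kind: 'sandbox'; sandboxTitle: string }; // draft safely in a personal sandbox

interface EditorContext {
  isLoggedIn: boolean;
  userName?: string;
  prefersSafeStart: boolean; // e.g. the editor chose the "not 100 percent sure" option
}

function landingPageCue(ctx: EditorContext, articleTitle: string): Cue {
  if (!ctx.isLoggedIn) {
    // Instead of a bare "permission denied", tell the reader up front
    // that an account is all they need.
    return { kind: 'login' };
  }
  if (ctx.prefersSafeStart) {
    // Cue the low-risk routine: start the article in a personal sandbox.
    return {
      kind: 'sandbox',
      sandboxTitle: `User:${ctx.userName ?? ''}/sandbox/${articleTitle}`,
    };
  }
  // Otherwise cue the direct routine, but set expectations about standards.
  return {
    kind: 'createNow',
    warning: 'New articles are subject to high standards and may be deleted if incomplete.',
  };
}

// Example: an anonymous visitor gets cued to log in instead of hitting a bare error.
console.log(landingPageCue({ isLoggedIn: false, prefersSafeStart: false }, 'Little Dutch Boy'));
```

The point is simply that each branch corresponds to one of the cues listed above, rather than to a permission error or an empty text box.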
Our approach is A/B testing: improving and optimizing a specific aspect of a specific habit loop. Later, this can be expanded to build new habits and better onboarding, such as cueing low-risk, high-reward tasks on the community portal and educating new editors so that they complete the routine.
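To make the experimental mechanics concrete, here is a minimal sketch of deterministic A/B bucketing, assuming a stable per-user token. It is not the internal toolchain mentioned later in this post; the hash function and bucket names are invented for illustration.

```typescript
// Minimal sketch of deterministic A/B bucketing for the landing-page test.
// Hashing a stable user token means the same person always sees the same
// variant, which keeps the two groups comparable over time.

const BUCKETS = ['control', 'landing-page'] as const;
type Bucket = (typeof BUCKETS)[number];

// Small, non-cryptographic string hash (FNV-1a); any stable hash would do.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash >>> 0;
}

function assignBucket(userToken: string, experiment: string): Bucket {
  // Mix in the experiment name so different tests get independent splits.
  const index = fnv1a(`${experiment}:${userToken}`) % BUCKETS.length;
  return BUCKETS[index];
}

// Example: route a visitor, so analysts can later compare article-creation
// rates between the two buckets.
const bucket = assignBucket('session-or-user-id', 'article-creation-landing-page');
console.log(bucket === 'landing-page'
  ? 'Show the new landing page with cues.'
  : 'Show the existing blank edit box.');
```

Hashing a stable identifier, rather than flipping a coin on every page view, is what lets us attribute any change in article-creation quality and survival to the variant a person actually experienced.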
While this example focuses on cueing first-time page creators and directing them through the routines that successful editors use to create pages, previous and existing experiments address other aspects of the myriad habits that turn users into editors in the community. Recent examples include:
Our vision: the unique challenge of Editor Engagement Experimentation
Around the same time our editorship started declining, the application of A/B testing to the web created two major growth booms in the commercial web: viral marketing and gamification.
But Editor Engagement Experimentation cannot simply be a blind application of A/B testing to steal back the cognitive surplus that shifted in the last decade from Wikipedia and its sister projects to social networks and social gaming. The reason is summed up in our Vision statement:
Imagine a world in which every single human being can freely share in the sum of all knowledge. That’s our commitment.
The goal of the commercial web is money. Viral marketing and gamification are designed simply to optimize the quantity of users or of their time, because those two quantities are intimately tied to revenues and profits. But our vision statement ties us not to the quantity of our editors or the quantity of our content, but to the quality of our content and the quality of the interaction every single human being has with that content, freely shared.
Reversing the editor decline is not an end in itself; it matters only insofar as it improves the quality of the sum of all human knowledge, and measures of editor engagement are only a rough proxy for this. Our vision makes the approach a unique challenge in the realm of engagement and experimentation: using tools designed to optimize quantities in order to improve qualities that cannot be directly measured. This similarly defines the uniqueness of the Wikimedia Foundation as an organization, Wikipedia as an encyclopedia, and our entire community of readers and editors as a movement.
Conclusion
We hope that, after reading this, you identify with the problem of editor decline as a threat to our shared vision, and that you are supportive of Editor Engagement Experimentation as one approach to tackling this problem.
If you see the editor decline and the limitations of the experimental approach not as an intractable problem or a source of stress, but as a unique challenge, the team invites you to find a way to participate that helps us address this challenge for our editors, our community, and our movement.
And for those of you interested in testing your mettle on this project, we have a couple of open positions below:
Our example is not the most complex engineering task we ask of our team, but it shows how we can make a positive difference to the volunteers who create and maintain the world’s biggest encyclopedia. If you’re game to build a prototype of this solution, then by all means, please be bold and show us.
(We’ve already built an internal toolchain for delivering controlled experiments and gathering data, so you’d be responsible for delivering the frontend and making it jibe with our system. Working on our team, you’d also have support from researchers, analysts, designers, and product managers, so don’t worry too much about number crunching or visual details.)
If you can architect a solution that will work for the 900+ pages created each day on English Wikipedia alone, then not only will we deploy your code, but you’ll make a difference to the experience of every editor creating the sum of all knowledge and every single human being who reads it.
That’s a promise you don’t hear every day, and we’d love for you to join us.
Terry Chay, Director of Features Engineering
Steven Walling, Associate Product Manager