Researchers from the Stevens Institute of Technology (SIT) have recently completed an analysis of the Wikipedia bots that work to maintain and improve the massive online encyclopedia. According to TechXplore, the results of the research could inform the way bots are used to develop commercial applications in fields like microchip design and customer service.
When Wikipedia first launched back in 2001, it had around 13,000 articles. Fast forward 18 years and Wikipedia is home to a vast amount of information, over 40 million articles in total, contributed to by over 500 million monthly users. To maintain all of these articles, Wikipedia relies on 137,000 volunteer editors and a large body of bots driven by simple AI programs. These bots fix tags, repair broken links, correct typos, eliminate junk entries, and more.
The research team utilized computer algorithms to classify the bots by the functions they carried out. This let the researchers analyze how AI programs and humans interact when engaging in large-scale projects like maintaining a vast repository of data such as Wikipedia. Understanding the way people and bots interact is a major focus of the field of Human-Computer Interaction, and as such the study was recently published in the Proceedings of the ACM on Human-Computer Interaction.
Jeffrey Nickerson, one of the study’s authors and a professor at SIT’s School of Business, explained that AI is making massive changes in the way knowledge is produced and maintained, and that Wikipedia’s size and ubiquity make it an excellent place to study these changes. Nickerson told TechXplore that in the future it is likely we will all be working alongside AI in some capacity, and therefore it’s important to understand how bots influence people’s decisions and how bots can be made into more effective tools.
Wikipedia made a great case study for the researchers because of its detailed record-keeping and transparency. The research team employed automatic classification algorithms to apply labels to bots and to develop a map detailing how bots interact with each other on Wikipedia. Clusters of related functions could be analyzed, and the bots that carried out those functions labeled with descriptions like “Advisor” or “Fixer”. Fixers take care of vandalism and repair broken links, while Advisors give tips to editors and suggest new tasks. There were also “Connectors”, which are responsible for establishing links between different resources or pages.
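The idea of labeling a bot by the functions it performs can be sketched in miniature. This is only an illustrative toy, not the study's method: the bot names, edit categories, and counts below are all invented, and the role labels simply follow the article's "Fixer", "Advisor", and "Connector" descriptions by assigning each bot the role that accounts for most of its edits.

```python
from collections import Counter

# Hypothetical edit logs: bot name -> list of edit types it performed.
# All names and activity data here are invented for illustration.
EDIT_LOGS = {
    "LinkRepairBot": ["fix_link", "fix_link", "revert_vandalism"],
    "WelcomeBot": ["greet_user", "suggest_task", "greet_user"],
    "InterwikiBot": ["connect_pages", "connect_pages", "fix_link"],
}

# Map each edit type to a role label, following the article's descriptions:
# Fixers handle vandalism and broken links, Advisors guide editors,
# Connectors link resources and pages together.
ROLE_OF_EDIT = {
    "fix_link": "Fixer",
    "revert_vandalism": "Fixer",
    "greet_user": "Advisor",
    "suggest_task": "Advisor",
    "connect_pages": "Connector",
}

def classify_bot(edits):
    """Assign the role accounting for the largest share of a bot's edits."""
    role_counts = Counter(ROLE_OF_EDIT[e] for e in edits)
    return role_counts.most_common(1)[0][0]

labels = {bot: classify_bot(edits) for bot, edits in EDIT_LOGS.items()}
print(labels)
```

A real classifier would draw on far richer features of each bot's edit history, but the core step is the same: summarize behavior, then attach a human-readable role label to each cluster of similar behavior.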
The researchers found that Wikipedia bots played nine main roles on the site, and that these bots accounted for around 10% of all Wikipedia activity. On certain subsections of the site, such as the Wikidata platform, bots account for about 88% of activity. Most bot activity is carried out by the approximately 1,200 fixer bots, which repair the site and are responsible for over 80 million edits. In contrast, while there are far fewer advisor bots, they help shape people’s interactions with the site, guiding what kinds of edits are made and what kinds of features are created.
One way Wikipedia leverages the power of bots is by greeting new members of the community. When people join online communities, they are more likely to stay on as active members if they are greeted by other members of the community. This apparently holds true even if the community member welcoming them is a bot. Bots also encourage community members to stick around and contribute by pointing out errors, as long as they are cordial about these corrections. As Nickerson explained to TechXplore:
“People don’t mind being criticized by bots, as long as they’re polite about it. Wikipedia’s transparency and feedback mechanisms help people to accept bots as legitimate members of the community.”
As bots become more and more important to the maintenance of growing online communities, studying how Wikipedia has leveraged bots can help other companies and entities create bots that help human users and encourage prosocial activity. Both Wikipedia’s successes and failures with bots should be critically examined.
“By studying Wikipedia, we can prepare for the future, and learn to build AI tools that improve both our productivity and the quality of our work,” said Nickerson.