Wikipedia tasks an AI with catching low quality edits
Maintaining Wikipedia requires such a high volume of human administrators to regulate content and edits that the site is now looking at introducing an AI to police adjustments.
To help Wikipedia stay on top of its half a million daily edits, an AI called the Objective Revision Evaluation Service (ORES) will function "like a pair of X-ray specs", providing another layer of quality control on the site.
Utilising machine learning, it will analyse amendments made to the site, using selected past edits as a benchmark against which to compare new entries.
With the MIT Technology Review estimating that there are only around 30,000 volunteer English-language editors, down 40 per cent in eight years, ORES has been enlisted to pick up the slack.
It scores human-made edits, judging whether or not the added content damages an entry deliberately or accidentally. This prioritises the articles that most need an editor's attention, and editors can now provide feedback to help new contributors improve.
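The triage described above can be sketched roughly as follows. This is an illustrative mock only: the field names, score values, and the `triage_edits` helper are hypothetical and do not reflect the real ORES API response format, which assigns each revision a probability that the edit is damaging.

```python
# Hypothetical sketch of edit triage: flag edits whose estimated
# "damaging" probability crosses a threshold, then sort them so the
# most likely damaging edits reach human editors first.
# The data structure below is invented for illustration.

def triage_edits(scored_edits, threshold=0.8):
    """Return revision IDs with damage probability >= threshold,
    most suspicious first."""
    flagged = [e for e in scored_edits if e["damaging_prob"] >= threshold]
    flagged.sort(key=lambda e: e["damaging_prob"], reverse=True)
    return [e["rev_id"] for e in flagged]

# Mock scores for three recent revisions (invented values).
scores = [
    {"rev_id": 101, "damaging_prob": 0.95},  # likely vandalism
    {"rev_id": 102, "damaging_prob": 0.10},  # probably a good-faith edit
    {"rev_id": 103, "damaging_prob": 0.85},  # suspicious
]

print(triage_edits(scores))  # → [101, 103]
```

Thresholding on a score like this is what lets a small pool of volunteer editors ignore the bulk of benign edits and concentrate on the few that a model considers likely to be damaging.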