
Meta wants to solve Wikipedia’s biggest problem using artificial intelligence



Despite the efforts of more than 30 million editors, Wikipedia is not perfect. Some information on Wikipedia lacks a genuine source or citation, and as we learned from the Pringle Man hoax, this can have a far-reaching impact on culture and "facts." But Meta, formerly Facebook, hopes to solve Wikipedia's big problem with AI.

Note: To be clear, this is an independent project by researchers at Meta AI, a division of the Meta corporation. The Wikimedia group is not involved and is not using SIDE to automatically update articles.

As detailed in a blog post and research paper, the Meta AI team built a dataset of over 134 million web pages to train a citation verification AI called SIDE. Using natural language processing, SIDE can analyze a Wikipedia citation and determine whether it actually supports the claim it is attached to. It can also suggest alternative sources for information already published on Wikipedia.
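SIDE itself relies on neural retrieval models trained on that 134-million-page corpus, but the core idea, scoring whether a cited source actually supports a claim and suggesting a better one if it doesn't, can be illustrated with a much simpler lexical-overlap sketch. The code below is purely illustrative and is not Meta's implementation; the threshold and similarity measure are arbitrary assumptions.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split text into alphanumeric tokens."""
    return re.findall(r"[a-z0-9]+", text.lower())

def similarity(a, b):
    """Bag-of-words cosine similarity between two texts (0.0 to 1.0)."""
    va, vb = Counter(tokenize(a)), Counter(tokenize(b))
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    norm = (math.sqrt(sum(v * v for v in va.values()))
            * math.sqrt(sum(v * v for v in vb.values())))
    return dot / norm if norm else 0.0

def best_source(claim, candidate_sources, threshold=0.1):
    """Return the candidate source most relevant to the claim,
    or None if nothing clears the threshold (flag for human review)."""
    ranked = sorted(candidate_sources,
                    key=lambda s: similarity(claim, s), reverse=True)
    if not ranked or similarity(claim, ranked[0]) < threshold:
        return None  # the existing citation looks unsupported
    return ranked[0]

claim = ("Joe Hipp was the first Native American to challenge "
         "for the WBA world heavyweight title.")
sources = [
    "A recipe blog about baking sourdough bread at home.",
    "Boxer Joe Hipp, a member of the Blackfeet Nation, challenged "
    "for the WBA world heavyweight title in 1995.",
]
print(best_source(claim, sources))  # picks the boxing source
```

A production system like SIDE replaces the bag-of-words comparison with learned dense embeddings and searches a web-scale index rather than a short list, but the flag-or-suggest decision structure is the same.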

An example of how SIDE can check and suggest new citations on Wikipedia.

Meta AI highlights the Blackfoot Confederacy Wikipedia article as an example of how SIDE can improve citations. If you scroll to the end of that article, you will learn that Joe Hipp was the first Native American to challenge for the WBA World Heavyweight Title, an interesting fact that is 100% true. But here's the problem: whoever added this fact cited a source that has nothing to do with Joe Hipp or the Blackfeet tribe.

In this case, Wikipedia's editors did not verify the citation (the problem has since been fixed). But if the editors had had SIDE, they might have caught the mismatched citation sooner. And they wouldn't have had to hunt for a new source, since SIDE would automatically suggest one.

At least, that is the hypothesis put forward by the Meta AI researchers. While SIDE is certainly an interesting tool, we still can't trust AI to understand the language, context, or veracity of anything posted online. (To be fair, the Meta AI research paper describes SIDE as more of a "demo" than a working tool.)

Wikipedia editors can now test SIDE and assess its usefulness. The project is also available on GitHub. For what it's worth, SIDE looks like a super-powered version of the tools Wikipedia editors already use to streamline their workflow. At the very least, it's easy to see how such a tool could flag dubious citations for humans to review.

Source: Meta AI
