The aims of WikiProject Source Metadata are:
- to act as a hub for work in Wikidata involving citation data and bibliographic data as part of the broader WikiCite initiative.
- to define a set of properties that can be used by citations, infoboxes, and Wikisource.
- to map and import all relevant metadata that currently is spread across Commons, Wikipedia, and Wikisource.
- to establish methods to interact with this metadata from different projects.
- to create a large open bibliographic database within Wikidata.
- to reveal, build, and maintain community stakeholdership for the inclusion and management of source metadata in Wikidata.
Wikiquote
Wikiquote is a free online compendium of sourced quotations from notable people and creative works in every language, translations of non-English quotes, and links to Wikipedia for further information. Visit the help page or experiment in the sandbox to learn how you can edit nearly any page right now, or log in to start contributing to Wikiquote.
Meet the Man Behind a Third of What’s on Wikipedia
Steven Pruitt has made nearly 3 million edits on Wikipedia and written 35,000 original articles. It’s earned him not only accolades but almost legendary status on the internet.
Happy 18th birthday, Wikipedia. Let’s celebrate the Internet’s good grown-up.
Last year, Wikipedia co-founder Jimmy Wales told NPR that Wikipedia has largely avoided the “fake news” problem, raising the question of what the encyclopedia does differently than other popular websites. As Brian Feldman suggested in New York magazine, perhaps it’s simply the willingness within the Wikipedia community to delete. If a user posts bad information on Wikipedia, other users are authorized and empowered to remove that unencyclopedic content. It’s a striking contrast to Twitter, which allows lies and inflammatory statements to remain on its platform for years.
The Wikipedia community has also embraced automated technologies to protect the integrity of the encyclopedia. While YouTube scans videos for potential content violations using its Content ID database, the community of Wikipedia editors has created editing bots that go further by making determinations about content quality. For example, ClueBot NG quickly reverts probable vandalism based on its machine-learning algorithm and a database of common indicators such as expletives and poor punctuation. In 2016, YouTube courted controversy for attempting to enforce its policy against inappropriate language, with many vloggers alleging censorship. But a civility requirement makes sense for Wikipedians because the community shares a vision: to build a better encyclopedia.