Wikipedia:Wikipedia Signpost/2018-02-20/Recent research

A monthly overview of recent academic research about Wikipedia and other Wikimedia projects, also published as the Wikimedia Research Newsletter.

Recent research

2018 research studies begin making their appearance

Controversy is an organized sport for some editors, but it may also alert readers that there is more than one view on a topic.

The study of controversy

"Computing controversy: Formal model and algorithms for detecting controversy on Wikipedia and in search queries"

Reviewed by Barbara Page

Abstract:

"Controversy is a complex concept that has been attracting attention of scholars from diverse fields. In the era of Internet and social media, detecting controversy and controversial concepts by the means of automatic methods is especially important. Web searchers could be alerted when the contents they consume are controversial or when they attempt to acquire information on disputed topics. Presenting users with the indications and explanations of the controversy should offer them chance to see the “wider picture” rather than letting them obtain one-sided views. In this work we first introduce a formal model of controversy as the basis of computational approaches to detecting controversial concepts. Then we propose a classification based method for automatic detection of controversial articles and categories in Wikipedia. Next, we demonstrate how to use the obtained results for the estimation of the controversy level of search queries. The proposed method can be incorporated into search engines as a component responsible for detection of queries related to controversial topics. The method is independent of the search engine’s retrieval and search results recommendation algorithms, and is therefore unaffected by a possible filter bubble.
Our approach can be also applied in Wikipedia or other knowledge bases for supporting the detection of controversy and content maintenance. Finally, we believe that our results could be useful for social science researchers for understanding the complex nature of controversy and in fostering their studies."[1]
  1. ^ Zielinski, Kazimierz; Nielek, Radoslaw; Wierzbicki, Adam; Jatowt, Adam. "Computing controversy: Formal model and algorithms for detecting controversy on Wikipedia and in search queries". Information Processing & Management. 54 (1): 14–36. doi:10.1016/j.ipm.2017.08.005.
Students learn their 'stuff' by editing

"Wikipedia in higher education: Changes in perceived value through content contribution"

Reviewed by Barbara Page

Abstract:

"Wikipedia is a widely used resource by university students, but it is not necessarily regarded as being reliable and trustworthy by them, nor is it seen as a context in which to make content contributions. This paper presents a teaching and research project that consisted in having students edit or create Wikipedia articles and testing whether or not this experience changed their perceived value of the platform. We conducted our experience at Universitat Pompeu Fabra (Barcelona, Spain) and University of Niš (Niš, Serbia) with a total number of 240 students. These students edited articles and answered two questionnaires, one before and one after the exercise. We compared the pre and post experience answers to the questionnaires with a series of paired samples t-tests, through which our data showed that students did significantly change their perception of reliability and usefulness, and of likeliness of finding false information on Wikipedia. Their appreciation of the task of writing Wikipedia articles, in terms of it being interesting and challenge also increased. They did not significantly change, however, their judgement on the social value of the platform, neither in the university nor in the general context. In addition, the open questions and informal feedback allowed us to gather valuable insights towards the evaluation of the overall experience."[1]
  1. ^ Soler-Adillon, Joan; Pavlovic, Dragana; Freixa, Pere (2018). "Wikipedia in higher education: Changes in perceived value through content contribution". Comunicar (in Spanish). 26 (54): 39–48. doi:10.3916/c54-2018-04. ISSN 1134-3478. An English version is also available.

"Excavating the mother lode of human-generated text: A systematic review of research that uses the Wikipedia corpus"

Reviewed by Barbara Page
Abstract:

"Although primarily an encyclopedia, Wikipedia’s expansive content provides a knowledge base that has been continuously exploited by researchers in a wide variety of domains. This article systematically reviews the scholarly studies that have used Wikipedia as a data source, and investigates the means by which Wikipedia has been employed in three main computer science research areas: information retrieval, natural language processing, and ontology building. We report and discuss the research trends of the identified and examined studies. We further identify and classify a list of tools that can be used to extract data from Wikipedia, and compile a list of currently available data sets extracted from Wikipedia."[1]

For anyone seeking a fairly comprehensive list of research that uses Wikipedia as a data source, this article is valuable. It describes how data from Wikipedia is gathered and used, and it points to one of Wikipedia's advantages as a research corpus: its size and the range of parameters that can be extracted from its articles.

  1. ^ Mehdi, Mohamad; Okoli, Chitu; Mesgari, Mostafa; Nielsen, Finn Årup; Lanamäki, Arto. "Excavating the mother lode of human-generated text: A systematic review of research that uses the Wikipedia corpus" (PDF). Information Processing & Management. 53 (2): 505–529. doi:10.1016/j.ipm.2016.07.003.

"Persistent Bias on Wikipedia: Methods and Responses"

Reviewed by Barbara Page

Abstract:

"Systematically biased editing, persistently maintained, can occur on Wikipedia while nominally following guidelines. Techniques for biasing an entry include deleting positive material, adding negative material, using a one-sided selection of sources, and exaggerating the significance of particular topics. To maintain bias in an entry in the face of resistance, key techniques are reverting edits, selectively invoking Wikipedia rules, and overruling resistant editors. Options for dealing with sustained biased editing include making complaints, mobilizing counterediting, and exposing the bias. To illustrate these techniques and responses, the rewriting of my own Wikipedia entry serves as a case study. It is worthwhile becoming aware of persistent bias and developing ways to counter it in order for Wikipedia to move closer to its goal of providing accurate and balanced information."[1]

  1. ^ Martin, Brian (2017). "Persistent Bias on Wikipedia: Methods and Responses". Social Science Computer Review.

"Information Fortification: An Online Citation Behavior"

Reviewed by Barbara Page

Abstract:

"In this multi-method study, we examine citation activity on English-language Wikipedia to understand how information claims are supported in a non-scientific open collaboration context. We draw on three data sources—edit logs, interview data, and document analysis—to present an integrated interpretation of citation activity and found pervasive themes related to controversy and conflict. Based on this analysis, we present and discuss information fortification as a concept that explains online citation activity that arises from both naturally occurring and manufactured forms of controversy. This analysis challenges a workshop position paper from Group 2005 by Forte and Bruckman, which draws on Latour’s sociology of science and citation to explain citation in Wikipedia with a focus on credibility seeking. We discuss how information fortification differs from theories of citation that have arisen from bibliometrics scholarship and are based on scientific citation practices."[1]

  1. ^ Forte, Andrea; Andalibi, Nazanin; Gorichanaz, Tim; Kim, Meen Chul; Park, Thomas; Halfaker, Aaron (2018-01-07). "Information Fortification: An Online Citation Behavior" (PDF). ACM: 83–92. doi:10.1145/3148330.3148347. ISBN 9781450355629.