When vast swathes of the human population interact on social media sites such as Facebook and Twitter, things tend naturally to devolve into bubbles. For example, those who support Hillary will be sufficiently motivated by the quite frankly absurd postings made by Trump supporters to simply block and purge those people, and in turn the Trump supporters will behave in a similar manner: anything pro-Hillary will be used to refine the list of feeds and posters that they subscribe to. The net effect is that bubbles of information (or should that be myth-information?) quite naturally emerge and thus insulate people from conflicting views.
Wikipedia is different, very different.
The Washington Post has an interesting article that revolves around a new study that highlights why it is distinctly different. There are a couple of points to draw out from that article.
First the obvious one, Wikipedia is incredibly useful to us all, hence it truly is very very popular …
It remains the fifth-most-visited site in the world, attracting nearly 8 billion page views in August alone. There are 5.2 million articles just on English Wikipedia, and it’s still growing by about 20,000 articles a month.
The second is that it quite effectively punches through the information bubbles that emerge on social media and enables people with distinctly diverse views to aggregate information and collaborate …
large groups of people are able to act rationally and solve problems despite having vastly different interests.
Well yes, but we do live in a reality where there are very polarised views, so it is inevitable that disagreements will emerge …
epic stories of “edit wars” litter the history of Wikipedia. Historians have raged over whether to use B.C./A.D. or B.C.E./C.E. notation to discuss when Jesus was born. Pop music aficionados have battled over whether Nelly Furtado is Canadian or Portuguese-Canadian. And pray you never get involved in the long-standing feud over how to pronounce J.K. Rowling’s last name (even though the author herself pronounces it “rolling”).
However, the end result does manage to rise above all of that …
what seems to hold it together is to have one paragraph for both sides. Space is free for all intents and purposes, and we can … edit each other’s paragraphs for accurate points of view.
Let's work through a simple example.
- Here is the Wikipedia page on evolution – It lays out what evolution actually is and is well cited. It not only describes the mechanisms in detail, but also includes a section on the social and cultural responses.
- Here is the Wikipedia page on creationism – It starts off by correctly identifying creationism as a religious belief, then proceeds to lay out the vast diversity of distinctly different, conflicting variations of creationist belief that prevail.
Clearly one is the prevailing scientific consensus, verified by mountains of evidence, and the other is just a belief system with no credible evidence. As an online neutral source of information, Wikipedia simply lays out the facts. If you were a creationist reading the evolution page, you might not accept what it is telling you, but you will at least not trip over any criticism of creationism, because the page just lays out the topic of evolution and cites reliable sources. Spin that coin and get an evolutionary biologist to read the page on creationism, and something similar is found. There it simply lays out what the various strands of creationists believe, and does not litter the page with unsubstantiated claims that it is all unassailable “truth”.
As a neutral source of information, it really does work.
The New Paper
The motivation for the Washington Post article is the publication of a new paper by Shane Greenstein, Yuan Gu and Feng Zhu of Harvard Business School, which highlights that Wikipedia is remarkably good at finding neutrality, even on the most contentious topics, and that, basically, is why it is perhaps the very best source of information we now have. There has truly never been anything like it in all of human history.
The paper itself is here on the Harvard Business School website…
For some odd reason the Washington Post article links to another site that shows the abstract and then wants you to pay $5 for the full PDF, but if you go directly to the Harvard Business School website (as I did in the link above), you can simply click and read without choking on a paywall.
The abstract for it reads …
Do online communities segregate into separate conversations when contributing to contestable knowledge involving controversial, subjective, and unverifiable topics? We analyze the contributors of biased and slanted content in Wikipedia articles about U.S. politics and focus on two research questions: (1) Do contributors display tendencies to contribute to sites with similar or opposing biases and slants? (2) Do contributors learn from experience with extreme or neutral content, and does that experience change the slant and bias of their contributions over time? The findings show enormous heterogeneity in contributors and their contributions, and, importantly, an overall trend towards less segregated conversations. A higher percentage of contributors have a tendency to edit articles with the opposite slant than articles with similar slant. We also observe the slant of contributions becoming more neutral over time, not more extreme, and, remarkably, the largest such declines are found with contributors who interact with articles that have greater biases. We also find some significant differences between Republicans and Democrats.
… and that, to be quite frank, highlights it as a rather interesting read.
Sanity warning: it runs to 41 pages.
In essence, what they have observed is that Wikipedia editors do not just edit content on topics they agree with; they also edit content on topics they disagree with. That, however, needs to be done in a manner that is acceptable to all. The end effect is that articles tend to be pushed away from a strong bias towards one specific viewpoint and into a more neutral position.
This also offers fascinating insights into how small tweaks in the way we interact online can make a huge difference to the quality of information. Wikipedia enables anybody to add or remove anything, so contributors learn from experience to moderate their contributions if they wish to see them stick and not get purged by another contributor. In contrast, contributors on Facebook and Twitter can only add content on top of what is already there, and that very much changes things: it enables any wacky view to shout, which motivates those who see things differently to simply block, and so bubbles naturally emerge.
One last observation. On the subject of contributors learning from experience to moderate their contributions on topics they disagree with, to the point where those contributions become acceptable, the authors note that …
this slant convergence process takes one year longer on average for Republicans than for Democrats
Why oh why does that not surprise me?