Crowdsourcing
{{Short description|Sourcing services or funds from a group}}
{{Redirect|Crowd work|the performing arts term|audience participation}}
{{Essay-like|date=September 2022}}
{{Use dmy dates |date= October 2020}}
Crowdsourcing involves a large group of dispersed participants contributing or producing goods or services—including ideas, votes, micro-tasks, and finances—for payment or as volunteers. Contemporary crowdsourcing often involves digital platforms to attract and divide work between participants to achieve a cumulative result. Crowdsourcing is not limited to online activity, however, and there are various historical examples of crowdsourcing. The word crowdsourcing is a portmanteau of "crowd" and "outsourcing".{{cite book|author1=Schenk, Eric|author2=Guittard, Claude|date=1 January 2009|title=Crowdsourcing What can be Outsourced to the Crowd and Why|url=https://hal.inria.fr/halshs-00439256v1|publisher=Center for Direct Scientific Communication|access-date=1 October 2018|via=HAL}}{{cite book|chapter-url= http://i3wue.de/staff/matthias.hirth/author_version/papers/conf_410_author_version.pdf|doi= 10.1109/IMIS.2011.89|chapter= Anatomy of a Crowdsourcing Platform – Using the Example of Microworkers.com|title= 2011 Fifth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing|year= 2011|last1= Hirth|first1= Matthias|last2= Hoßfeld|first2= Tobias|last3= Tran-Gia|first3= Phuoc|pages= 322–329|isbn= 978-1-61284-733-7|s2cid= 12955095|access-date= 5 September 2015|archive-date= 22 November 2015|archive-url= https://web.archive.org/web/20151122025307/http://i3wue.de/staff/matthias.hirth/author_version/papers/conf_410_author_version.pdf|url-status= dead}}{{Citation | last1 = Estellés-Arolas | first1 = Enrique | last2 = González-Ladrón-de-Guevara | first2 = Fernando | title = Towards an Integrated Crowdsourcing Definition | journal = Journal of Information Science | volume = 38 | issue = 2 | year = 2012 | pages = 189–200 | url = http://www.crowdsourcing-blog.org/wp-content/uploads/2012/02/Towards-an-integrated-crowdsourcing-definition-Estell%C3%A9s-Gonz%C3%A1lez.pdf | doi = 10.1177/0165551512437638 | hdl = 10251/56904 | s2cid = 18535678 | access-date = 16 March 2012 | archive-date = 19 August 2019 | archive-url = https://web.archive.org/web/20190819041024/http://www.crowdsourcing-blog.org/wp-content/uploads/2012/02/Towards-an-integrated-crowdsourcing-definition-Estell%C3%A9s-Gonz%C3%A1lez.pdf | url-status = dead }} In contrast to outsourcing, crowdsourcing usually involves less specific and more public groups of participants.Brabham, D. C. (2013). Crowdsourcing. Cambridge, Massachusetts; London, England: The MIT Press.{{cite journal | last1 = Brabham | first1 = D. C. | year = 2008 | title = Crowdsourcing as a Model for Problem Solving an Introduction and Cases | journal = Convergence: The International Journal of Research into New Media Technologies | volume = 14 | issue = 1| pages = 75–90 | doi= 10.1177/1354856507084420 | citeseerx = 10.1.1.175.1623 | s2cid = 145310730 }}Prpić, J., & Shukla, P. (2016). Crowd Science: Measurements, Models, and Methods. In Proceedings of the 49th Annual Hawaii International Conference on System Sciences, Kauai, Hawaii: IEEE Computer Society. {{arxiv|1702.04221}}
Advantages of using crowdsourcing include lowered costs, improved speed, improved quality, increased flexibility, and/or increased scalability of the work, as well as promoting diversity.{{cite conference |title= A Systematic Literature Review of Crowdsourcing Research from a Human Resource Management Perspective |last= Buettner |first= Ricardo |year= 2015 |conference= 48th Annual Hawaii International Conference on System Sciences |conference-url= http://www.hicss.hawaii.edu/hicss_48/apahome48.htm |publisher= IEEE |location= Kauai, Hawaii |pages= 4609–4618 |isbn= 978-1-4799-7367-5 |doi= 10.13140/2.1.2061.1845}}{{cite journal|last1= Prpić|first1= John|last2= Taeihagh|first2= Araz|last3= Melton|first3= James|title= The Fundamentals of Policy Crowdsourcing|journal= Policy & Internet|date= September 2015|volume= 7|issue= 3|pages= 340–361|doi= 10.1002/poi3.102|arxiv= 1802.04143|s2cid= 3626608}} Crowdsourcing methods include competitions, virtual labor markets, open online collaboration and data donation.{{cite journal |last1=Afuah |first1=A. |last2=Tucci |first2=C. L. |year=2012 |title=Crowdsourcing as a Solution to Distant Search |journal=Academy of Management Review |volume=37 |issue=3 |pages=355–375 |doi=10.5465/amr.2010.0146|url=https://infoscience.epfl.ch/record/180049/files/afuah_tucci_AMR_2012_FINAL.pdf }}de Vreede, T., Nguyen, C., de Vreede, G. J., Boughzala, I., Oh, O., & Reiter-Palmon, R. (2013). A Theoretical Model of User Engagement in Crowdsourcing. In Collaboration and Technology (pp. 94–109). Springer Berlin Heidelberg{{Cite journal |last1=Sarin |first1=Supheakmungkol |last2=Pipatsrisawat |first2=Knot |last3=Pham |first3=Khiêm |last4=Batra |first4=Anurag |last5=Valente |first5=Luis |date=2019 |title=Crowdsource by Google: A Platform for Collecting Inclusive and Representative Machine Learning Data |url=https://www.humancomputation.com/2019/assets/papers/143.pdf |journal=AAAI Hcomp 2019}} Some forms of crowdsourcing, such as in "idea competitions" or "innovation contests" provide ways for organizations to learn beyond the "base of minds" provided by their employees (e.g. Lego Ideas).{{Cite journal|last1= Liu|first1= Wei|last2= Moultrie|first2= James|last3= Ye|first3= Songhe|date= 2019-05-04|title= The Customer-Dominated Innovation Process: Involving Customers as Designers and Decision-Makers in Developing New Product|journal= The Design Journal|volume= 22|issue= 3|pages= 299–324|doi= 10.1080/14606925.2019.1592324|s2cid= 145931864|url= https://www.repository.cam.ac.uk/handle/1810/341960}}{{Citation | last1 = Schlagwein | first1 = Daniel | last2 = Bjørn-Andersen | first2 = Niels | title = Organizational Learning with Crowdsourcing: The Revelatory Case of Lego | journal = Journal of the Association for Information Systems | volume = 15 | issue = 11 | pages = 754–778 | year = 2014 | url = http://aisel.aisnet.org/cgi/viewcontent.cgi?article=1693&context=jais | format= PDF | doi = 10.17705/1jais.00380 | s2cid = 14811856 }}{{Promotion inline|date=September 2022}} Commercial platforms, such as Amazon Mechanical Turk, match microtasks submitted by requesters to workers who perform them. Crowdsourcing is also used by nonprofit organizations to develop common goods, such as Wikipedia.{{Cite journal|last= Taeihagh|first= Araz|date= 2017-06-19|title= Crowdsourcing, Sharing Economies, and Development|journal= Journal of Developing Societies|volume= 33|issue= 2|doi= 10.1177/0169796x17710072|page= 0169796X1771007|arxiv= 1707.06603|s2cid= 32008949}}
Definitions
The term crowdsourcing, a portmanteau of "crowd" and "outsourcing", was coined in 2006 by two editors at Wired, Jeff Howe and Mark Robinson, to describe how businesses were using the Internet to "outsource work to the crowd". The Oxford English Dictionary gives 2006 as its earliest evidence for the word, "in the writing of J. Howe".{{cite web| title=crowdsourcing (noun)| url=https://www.oed.com/dictionary/crowdsourcing_n?tab=factsheet#288590721| publisher=Oxford English Dictionary| date=2023| access-date=3 January 2024}} The online dictionary Merriam-Webster defines it as: "the practice of obtaining needed services, ideas, or content by soliciting contributions from a large group of people and especially from the online community rather than from traditional employees or suppliers."{{cite web| title=crowdsourcing (noun)| url=https://www.merriam-webster.com/dictionary/crowdsourcing| publisher=Merriam-Webster| date=2024| access-date=3 January 2024}}
Daren C. Brabham defined crowdsourcing as an "online, distributed problem-solving and production model." Kristen L. Guth and Brabham found that the performance of ideas offered in crowdsourcing platforms are affected not only by their quality, but also by the communication among users about the ideas, and presentation in the platform itself.{{Cite journal|last1=Guth|first1=Kristen L.|last2=Brabham|first2=Daren C.| date=2017-08-04|title=Finding the diamond in the rough: Exploring communication and platform in crowdsourcing performance|journal=Communication Monographs| volume=84| issue=4| pages=510–533| doi=10.1080/03637751.2017.1359748| s2cid=54045924}}
Despite the multiplicity of definitions for crowdsourcing, one constant has been the broadcasting of problems to the public, and an open call for contributions to help solve the problem.{{Original research inline|date=September 2022}} Members of the public submit solutions that are then owned by the entity who originally broadcast the problem. In some cases, the contributor of the solution is compensated monetarily with prizes or public recognition. In other cases, the only rewards may be praise or intellectual satisfaction. Crowdsourcing may produce solutions from amateurs or volunteers working in their spare time, from experts, or from small businesses.{{cite news|last=Howe|first=Jeff|year=2006|title=The Rise of Crowdsourcing|magazine=Wired|url=https://www.wired.com/wired/archive/14.06/crowds.html}}
Historical examples
= Timeline of crowdsourcing examples =
- 618–907 – The Tang dynasty of China introduced the joint-stock company, the earliest form of crowdfunding. During a cold period of the dynasty, colder climates led to poor harvests and reduced agricultural tax revenue, culminating in the fragmentation of the agricultural sector.{{cite journal |last1=Wei |first1=Zhudeng |last2=Fang |first2=Xiuqi |last3=Yin |first3=Jun |title=Comparison of climatic impacts transmission from temperature to grain harvests and economies between the Han (206 BC–AD 220) and Tang (AD 618–907) dynasties |journal=The Holocene |date=October 2018 |volume=28 |issue=10 |page=1606 |doi=10.1177/0959683618782592 |bibcode=2018Holoc..28.1598W |s2cid=134577720 }} This fragmentation forced the government to reform the tax system to rely more on taxes on salt and, most importantly, on business, leading to the creation of the joint-stock company.
- 1567 – King Philip II of Spain offered a cash prize for calculating the longitude of a vessel while at sea.{{cite web| title=Longitude and the Académie Royale| author1=O'Connor, J. J.| author2=Robertson, E. F.| url=https://mathshistory.st-andrews.ac.uk/HistTopics/Longitude1| publisher=University of St. Andrews| date=February 1997| access-date=20 January 2024}}
- 1714 – The longitude rewards: When the British government was trying to find a way to measure a ship's longitudinal position, they offered the public a monetary prize to whoever came up with the best solution.{{cite web|url=http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532 |title=A Brief History of Crowdsourcing [Infographic] |publisher=Crowdsourcing.org |date=2012-03-18 |access-date=2015-07-02 |archive-url=https://web.archive.org/web/20150703041454/http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532 |archive-date=2015-07-03 |url-status=usurped }}{{Cite journal |last1=Cattani |first1=Gino |last2=Ferriani |first2=Simone |last3=Lanza |first3=Andrea |date=December 2017 |title=Deconstructing the Outsider Puzzle: The Legitimation Journey of Novelty |url=https://pubsonline.informs.org/doi/10.1287/orsc.2017.1161 |journal=Organization Science |volume=28 |issue=6 |pages=965–992 |doi=10.1287/orsc.2017.1161 |issn=1047-7039}}
- 1783 – King Louis XVI offered an award to the person who could "make the alkali" by decomposing sea salt by the "simplest and most economic method".
- 1848 – Matthew Fontaine Maury distributed 5000 copies of his Wind and Current Charts free of charge on the condition that sailors returned a standardized log of their voyage to the U.S. Naval Observatory. By 1861, he had distributed 200,000 copies free of charge, on the same conditions.Hern, Chester G.(2002). Tracks in the Sea, p. 123 & 246. McGraw Hill. {{ISBN|0-07-136826-4}}.
- 1849 – A network of some 150 volunteer weather observers all over the USA was set up as a part of the Smithsonian Institution's Meteorological Project started by the Smithsonian's first Secretary, Joseph Henry, who used the telegraph to gather volunteers' data and create a large weather map, making new information available to the public daily. For instance, volunteers tracked a tornado passing through Wisconsin and sent the findings via telegraph to the Smithsonian. Henry's project is considered the origin of what later became the National Weather Service. Within a decade, the project had more than 600 volunteer observers and had spread to Canada, Mexico, Latin America, and the Caribbean.{{cite web|url=https://siarchives.si.edu/blog/smithsonian-crowdsourcing-1849 |title=Smithsonian Crowdsourcing Since 1849 |publisher=Smithsonian Institution Archives |date=2011-04-14 |access-date=2018-08-24}}
- 1884 – Publication of the Oxford English Dictionary: 800 volunteers catalogued words to create the first fascicle of the OED.
- 1916 – Planters Peanuts logo contest: The Mr. Peanut logo was designed by a 14-year-old boy who won the contest.
- 1957 – Jørn Utzon was selected as winner of the design competition for the Sydney Opera House.
- 1970 – French amateur photo contest C'était Paris en 1970 ("This Was Paris in 1970") was sponsored by the city of Paris, France-Inter radio, and the Fnac: 14,000 photographers produced 70,000 black-and-white prints and 30,000 color slides of the French capital to document the architectural changes of Paris. Photographs were donated to the Bibliothèque historique de la ville de Paris.{{Cite journal|url=http://etudesphotographiques.revues.org/3407 |title='C'était Paris en 1970' |journal=Études Photographiques |issue=31 |date=1970-04-25 |access-date=2015-07-02|last1=Clark |first1=Catherine E. }}
- 1979 – Robert Axelrod invited academics to submit FORTRAN algorithms online to play the repeated Prisoner's Dilemma; a tit-for-tat algorithm ended up in first place (a minimal sketch of the winning strategy appears after this timeline).{{Citation |last1=Axelrod |first1=Robert |date=1980 |title=Effective choice in the Prisoner's Dilemma |journal=Journal of Conflict Resolution |volume=24 |issue=1 |pages=3–25 |doi=10.1177/002200278002400101 |s2cid=143112198 }}
- 1983 – Richard Stallman began work on the GNU operating system. Programmers from around the world contribute to the operating system. The Linux kernel is one of the kernels used in it, forming the GNU/Linux operating system, which many people simply call Linux.
- 1996 – The Hollywood Stock Exchange was founded: it allows players to buy and sell virtual shares of movies and celebrities.
- 1997 – British rock band Marillion raised $60,000 from their fans to help finance their U.S. tour.
- 1999 – SETI@home was launched by the University of California, Berkeley. Volunteers contribute to the search for signals that might come from extraterrestrial intelligence by installing a program that uses idle computer time to analyze chunks of data recorded by radio telescopes involved in the SERENDIP program.{{cite web| title=SETI@home| url=https://setiathome.berkeley.edu| publisher=University of California| access-date=20 January 2024}}
- 1999 – The U.S. Geological Survey's (USGS) "Did You Feel It?" website was used in the US as a method whereby residents could report any tremors or shocks they felt from a recent earthquake, along with its approximate magnitude.{{Cite journal |last1=Brabham |first1=Daren C. |last2=Ribisl |first2=Kurt M. |last3=Kirchner |first3=Thomas R. |last4=Bernhardt |first4=Jay M. |date=2014-02-01 |title=Crowdsourcing Applications for Public Health |journal=American Journal of Preventive Medicine |volume=46 |issue=2 |pages=179–187 |doi=10.1016/j.amepre.2013.10.016 |pmid=24439353 |s2cid=205436420}}
- 2000 – JustGiving was established: This online platform allows the public to help raise money for charities.
- 2000 – UNV Online Volunteering service launched: Connecting people who commit their time and skills over the Internet to help organizations address development challenges.{{cite web |url=https://www.onlinevolunteering.org/en/org/about/history.html |title=UNV Online Volunteering Service | History |publisher=Onlinevolunteering.org |access-date=2015-07-02 |archive-url=https://web.archive.org/web/20150702145948/https://www.onlinevolunteering.org/en/org/about/history.html |archive-date=2015-07-02 |url-status=dead }}
- 2000 – iStockPhoto was founded: The free stock imagery website allows the public to contribute images and receive commission for their contributions.{{cite news|url=http://archive.wired.com/wired/archive/14.06/crowds.html |title=Wired 14.06: The Rise of Crowdsourcing |publisher=Archive.wired.com |date=2009-01-04 |access-date=2015-07-02}}
- 2001 – Launch of Wikipedia: "Free-access, free content Internet encyclopedia".{{cite book|url=https://archive.org/details/wikipediarevolut00liha|title=The Wikipedia revolution: how a bunch of nobodies created the world's greatest encyclopedia|last1=Lih|first1=Andrew|date=2009|publisher=Hyperion|isbn=978-1401303716|edition=1st|location=New York|url-access=registration}}
- 2001 – Foundation of Topcoder, a crowdsourcing software development company.{{Cite journal|vauthors=Lakhani KR, Garvin DA, Lonstein E|date=January 2010|title=TopCoder (A): Developing Software through Crowdsourcing|url=https://www.hbs.edu/faculty/Pages/item.aspx?num=38356|journal=Harvard Business School Case|pages=610–032}}{{Cite news|url=https://timesofindia.indiatimes.com/deals/-ma/Appirios-TopCoder-too-is-a-big-catch-for-Wipro/articleshow/54970568.cms|title=Appirio's TopCoder too is a big catch for Wipro|last=Phadnisi|first=Shilpa|date=21 October 2016|work=The Times of India|access-date=30 April 2018}}
- 2004 – OpenStreetMap, a collaborative project to create a free editable map of the world, was launched.{{cite web| title=For The Love Of Open Mapping Data| author=Lardinois, F.| url=https://techcrunch.com/2014/08/09/for-the-love-of-open-mapping-data| publisher=Yahoo| date=9 August 2014| access-date=20 January 2024}}{{Cite journal |last1=Nagaraj |first1=Abhishek |last2=Piezunka |first2=Henning |date=September 2024 |title=The Divergent Effect of Competition on Platforms: Deterring Recruits, Motivating Converts |url=https://pubsonline.informs.org/doi/10.1287/stsc.2022.0125 |journal=Strategy Science |volume=9 |issue=3 |pages=277–296 |doi=10.1287/stsc.2022.0125 |issn=2333-2050|url-access=subscription }}
- 2004 – Toyota's first "Dream car art" contest: Children were asked globally to draw their "dream car of the future".{{cite web|url=http://www.tiki-toki.com/timeline/entry/323158/Crowdsourcing-Back-Up-Timeline-Early-Stories/ |title=Crowdsourcing Back-Up Timeline Early Stories |archiveurl=https://web.archive.org/web/20141129054631/http://www.tiki-toki.com/timeline/entry/323158/Crowdsourcing-Back-Up-Timeline-Early-Stories/|archivedate=29 November 2014}}{{better source needed|date=November 2021}}
- 2005 – Kodak's "Go for the Gold" contest: Kodak asked anyone to submit a picture of a personal victory.
- 2005 – Amazon Mechanical Turk (MTurk) was launched publicly on November 2, 2005. It enables businesses to hire remotely located "crowdworkers" to perform discrete on-demand tasks that computers are currently unable to do.{{Cite web |title=Amazon Mechanical Turk |url=https://www.mturk.com/worker/help |access-date=2022-11-25 |website=www.mturk.com}}
- 2005 – Reddit was launched.{{cite web| title=reddit on June23-05| author=Ohanian, A.| url=https://www.flickr.com/photos/33809408@N00/315068778/in/photostream| website=Flickr| date=December 5, 2006| access-date=20 January 2024}} Reddit is a social media platform and online community where users submit, discuss, and vote on content, leading to diverse discussions and interactions.
- 2009 – Waze (then named FreeMap Israel), a community-oriented GPS app, was created.{{cite web| title=Waze| url=https://www.waze.com/about| publisher=Waze Mobile| date=2009| access-date=20 January 2024}} It allows users to submit road information and route data based on location, such as reports of car accidents or traffic, and integrates that data into its routing algorithms for all users of the app.
- 2010 – Following the Deepwater Horizon oil spill, BP initiated a crowdsourcing effort called the "Deepwater Horizon Response," inviting external experts and the public to submit innovative ideas and technical solutions for containing and cleaning up the massive oil spill. This initiative aimed to leverage collective intelligence to address the unprecedented environmental disaster.{{Cite journal |last1=Piezunka |first1=Henning |last2=Dahlander |first2=Linus |date=June 2015 |title=Distant Search, Narrow Attention: How Crowding Alters Organizations' Filtering of Suggestions in Crowdsourcing |url=https://journals.aom.org/doi/10.5465/amj.2012.0458 |journal=Academy of Management Journal |volume=58 |issue=3 |pages=856–880 |doi=10.5465/amj.2012.0458 |issn=0001-4273|url-access=subscription }}
- 2010 – The 1947 Partition Archive, an oral history project that asked community members around the world to document oral histories from aging witnesses of a significant but under-documented historical event, the 1947 Partition of India, was founded.{{cite news| title=Potent Memories From a Divided India| author=Sengupta, S.| url=https://www.nytimes.com/2013/08/14/arts/potent-memories-from-a-divided-india.html?_r=0| work=New York Times| date=13 August 2013| access-date=20 January 2024}}
- 2011 – Casting of Flavours ("Do Us a Flavor" in the USA) – a campaign launched by PepsiCo's Lay's in Spain to create a new flavor for the snack, with consumers directly involved in its creation.{{cite book |last1=Garrigos-Simon |first1=Fernando J. |last2=Gil-Pechuán |first2=Ignacio |last3=Estelles-Miguel |first3=Sofia |title=Advances in Crowdsourcing |date=2015 |publisher=Springer |isbn=9783319183411 |url=https://books.google.com/books?id=WrclCQAAQBAJ&q=pepsico+2012+do+me+a+flavor+crowdsourcing&pg=PA154 }}
- 2012 – Open Food Facts, a collaborative project to create a free, open database of food products from around the world using smartphones, was launched; it was later extended to cover cosmetics, pet food, other products, and prices.
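The tit-for-tat strategy that won Axelrod's tournament (see the 1979 entry above) is simple enough to sketch in a few lines. The following Python is illustrative only; the original submissions were written in FORTRAN, and the payoff values are the standard ones Axelrod used.

<syntaxhighlight lang="python">
def tit_for_tat(my_history, opponent_history):
    """Return 'C' (cooperate) or 'D' (defect)."""
    if not opponent_history:       # first round: cooperate
        return 'C'
    return opponent_history[-1]    # then mirror the opponent's last move

# Axelrod's payoff matrix: (my points, their points)
PAYOFFS = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
           ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def play(strategy_a, strategy_b, rounds=200):
    """Play two strategies against each other; return their total scores."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFFS[(move_a, move_b)]
        score_a, score_b = score_a + pay_a, score_b + pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

always_defect = lambda mine, theirs: 'D'
print(play(tit_for_tat, always_defect))  # (199, 204): exploited once, never again
</syntaxhighlight>

Tit for tat never defects first but immediately punishes defection, which is why it outperformed more aggressive entries over the tournament as a whole.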
= Early competitions =
Crowdsourcing has often been used in the past as a competition to discover a solution. The French government proposed several of these competitions, often rewarded with Montyon Prizes.{{cite web|url=http://www.newadvent.org/cathen/10552a.htm |title=Antoine-Jean-Baptiste-Robert Auget, Baron de Montyon |website=New Advent | access-date=25 February 2012}} These included the Alkali prize, which led to the Leblanc process for producing alkali from sea salt, and a prize that rewarded Fourneyron's turbine, the first commercial hydraulic turbine.{{cite web|url=http://pubs.acs.org/subscribe/archive/tcaw/11/i01/html/01chemchron.html |title=It Was All About Alkali |publisher=Chemistry Chronicles | access-date=25 February 2012}}
In response to a challenge from the French government, Nicolas Appert won a prize for inventing a new way of food preservation that involved sealing food in air-tight jars.{{cite web |url= http://www.brooklyn.cuny.edu/bc/ahp/MBG/MBG4/Appert.html |title=Nicolas Appert |publisher=John Blamire | access-date=25 February 2012}} The British government provided a similar reward to find an easy way to determine a ship's longitude in the Longitude Prize. During the Great Depression, out-of-work clerks tabulated higher mathematical functions in the Mathematical Tables Project as an outreach project.{{cite news|url=http://memeburn.com/2011/09/9-examples-of-crowdsourcing-before-%E2%80%98crowdsourcing%E2%80%99-existed/|title=9 Examples of Crowdsourcing, Before 'Crowdsourcing' Existed |newspaper=MemeBurn | access-date=25 February 2012|date=2011-09-15 }}{{Unreliable source?|date=September 2022}} One of the largest crowdsourcing campaigns was a public design contest in 2010, hosted by the Indian government's finance ministry, to create a symbol for the Indian rupee. Thousands of people sent in entries before the government settled on a final symbol based on the Devanagari letter Ra.{{cite web | last = Pande | first = Shamni | title = The People Know Best| url = http://businesstoday.intoday.in/story/crowdsourcing-is-the-new-buzzword-in-communications/1/195160.html/| location= India| work= Business Today| date = 25 May 2013 | publisher= Living Media India Limited}}
Applications
{{See also|List of crowdsourcing projects}}
A number of motivations exist for businesses to use crowdsourcing to accomplish their tasks. These include the ability to offload peak demand, access cheap labor and information, generate better results, access a wider array of talent than what is present in one organization, and undertake problems that would have been too difficult to solve internally.{{Citation |last1=Noveck |first1=Beth Simone |title=Wiki Government: How Technology Can Make Government Better, Democracy Stronger, and Citizens More Powerful |year=2009 |publisher=Brookings Institution Press}} Crowdsourcing allows businesses to submit problems on which contributors can work—on topics such as science, manufacturing, biotech, and medicine—optionally with monetary rewards for successful solutions. Although crowdsourcing complicated tasks can be difficult, simple work tasks{{Specify|date=September 2022}} can be crowdsourced cheaply and effectively.{{Citation |last1=Sarasua |first1=Cristina |title=Crowdsourcing Ontology Alignment with Microtasks |url=https://web.stanford.edu/~natalya/papers/iswc2012_crowdmap.pdf |journal=Institute AIFB. Karlsruhe Institute of Technology |page=2 |year=2012 |last2=Simperl |first2=Elena |last3=Noy |first3=Natalya F. |access-date=18 September 2021 |archive-date=5 March 2016 |archive-url=https://web.archive.org/web/20160305071957/http://web.stanford.edu/~natalya/papers/iswc2012_crowdmap.pdf |url-status=dead }}
Crowdsourcing also has the potential to be a problem-solving mechanism for government and nonprofit use.{{Cite journal |last1=Hollow |first1=Matthew |date=20 April 2013 |title=Crowdfunding and Civic Society in Europe: A Profitable Partnership? |url=https://www.academia.edu/3415172 |journal=Open Citizenship |access-date=29 April 2013}} Urban and transit planning are prime areas for crowdsourcing. For example, from 2008 to 2009, a crowdsourcing project for transit planning in Salt Lake City was created to test the public participation process.
{{Citation |title=Federal Transit Administration Public Transportation Participation Pilot Program |url=http://www.fta.dot.gov./planning/programs/planning_environment_8711.html |archive-url=https://web.archive.org/web/20090107140521/http://www.fta.dot.gov./planning/programs/planning_environment_8711.html |publisher=U.S. Department of Transportation |archive-date=7 January 2009 |url-status=dead}} Another notable application of crowdsourcing for government problem-solving is Peer-to-Patent, which was an initiative to improve patent quality in the United States through gathering public input in a structured, productive manner.{{Citation |title=Peer-to-Patent Community Patent Review Project |url=http://www.peertopatent.org/ |publisher=Peer-to-Patent Community Patent Review Project}}
Researchers have used crowdsourcing systems such as Amazon Mechanical Turk or CloudResearch to aid their research projects by crowdsourcing some aspects of the research process, such as data collection, parsing, and evaluation to the public. Notable examples include using the crowd to create speech and language databases,{{Citation |last1=Callison-Burch |first1=C. |title=Creating Speech and Language Data With Amazon's Mechanical Turk |url=http://www.aclweb.org/anthology-new/W/W10/W10-0701.pdf |journal=Human Language Technologies Conference |pages=1–12 |year=2010 |archive-url=https://web.archive.org/web/20120802162113/http://www.aclweb.org/anthology-new/W/W10/W10-0701.pdf |access-date=2012-02-28 |archive-date=2012-08-02 |last2=Dredze |first2=M. |url-status=dead}}{{Citation |last1=McGraw |first1=I. |url=http://people.csail.mit.edu/jrg/2011/McGraw_Interspeech11.pdf |pages=3057–3060 |year=2011 |doi=10.21437/Interspeech.2011-765 |last2=Seneff |first2=S.|title=Interspeech 2011 |chapter=Growing a spoken language interface on Amazon Mechanical Turk }} to conduct user studies, and to run behavioral science surveys and experiments.{{Cite book |last1=Litman |first1=Leib |url=https://www.amazon.com/Conducting-Research-Mechanical-Innovations-Methods-ebook/dp/B086QTWBNC |title=Conducting Online Research on Amazon Mechanical Turk and Beyond. |last2=Robinson |first2=Jonathan |publisher=SAGE Publications |year=2020 |isbn=978-1506391137}} Crowdsourcing systems provided researchers with the ability to gather large amounts of data, and helped researchers to collect data from populations and demographics they may not have access to locally.{{Citation |last1=Mason |first1=W. |title=Conducting Behavioral Research on Amazon's Mechanical Turk |journal=Behavior Research Methods |year=2010 |ssrn=1691163 |last2=Suri |first2=S.}}{{Failed verification|date=September 2022}}
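As an illustration of how requesters drive such microtask platforms programmatically, the sketch below posts a task ("HIT") to Amazon Mechanical Turk's sandbox using the AWS boto3 SDK. The survey URL, reward, and other parameter values are placeholders rather than settings from any particular study.

<syntaxhighlight lang="python">
import boto3

# The sandbox endpoint lets requesters test HITs without paying workers.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# An ExternalQuestion embeds a requester-hosted page; the URL is a placeholder.
question_xml = """<ExternalQuestion
  xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.org/survey</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>"""

hit = mturk.create_hit(
    Title="Short research survey (5 minutes)",      # placeholder text
    Description="Answer a brief research questionnaire.",
    Keywords="survey, research",
    Reward="0.50",                    # US dollars, passed as a string
    MaxAssignments=100,               # number of distinct workers wanted
    LifetimeInSeconds=7 * 24 * 3600,  # how long the HIT stays listed
    AssignmentDurationInSeconds=1800,
    Question=question_xml,
)
print("HIT created:", hit["HIT"]["HITId"])
</syntaxhighlight>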
Artists have also used crowdsourcing systems. In a project called the Sheep Market, Aaron Koblin used Mechanical Turk to collect 10,000 drawings of sheep from contributors around the world.{{Cite book|last1=Koblin |first1=A. |title=Proceedings of the seventh ACM conference on Creativity and cognition |chapter=The sheep market |date=2009 |pages=451–452 |doi=10.1145/1640233.1640348 |isbn=9781605588650 |s2cid=20609292}} Artist Sam Brown leveraged the crowd by asking visitors to his website explodingdog to send him sentences to use as inspiration for his paintings.{{cite web |title=explodingdog 2015 |url=http://www.explodingdog.com/ |access-date=2015-07-02 |publisher=Explodingdog.com}} Art curator Andrea Grover argues that individuals tend to be more open in crowdsourced projects because they are not being physically judged or scrutinized.
{{cite news |last=DeVun |first=Leah |date=19 November 2009 |title=Looking at how crowds produce and present art. |newspaper=Wired News |url=https://www.wired.com/techbiz/media/news/2007/07/crowd_captain?currentPage=all |access-date=26 February 2012 |archive-url=https://web.archive.org/web/20121024130503/http://www.wired.com/techbiz/media/news/2007/07/crowd_captain?currentPage=all |archive-date=2012-10-24}} As in other fields, artists use crowdsourcing systems to generate and collect data. The crowd can also be used to provide inspiration and to collect financial support for an artist's work.{{Citation |last1=Linver |first1=D. |title=Crowdsourcing and the Evolving Relationship between Art and Artist |url=http://www.crowdsourcing.org/document/crowdsourcing-and-the-evolving-relationship-between-artist-and-audience/5515 |year=2010 |archive-url=https://web.archive.org/web/20140714163540/http://www.crowdsourcing.org/document/crowdsourcing-and-the-evolving-relationship-between-artist-and-audience/5515 |access-date=2012-02-28 |archive-date=2014-07-14 |url-status=usurped}}
In navigation systems, INRIX used crowdsourced data from 100 million drivers to collect driving times and provide better GPS routing and real-time traffic updates.{{cite web |date=2014-09-13 |title=Why |url=http://www.inrix.com/companyoverview.asp |url-status=dead |archive-url=https://web.archive.org/web/20141012000923/http://www.inrix.com/companyoverview.asp |archive-date=2014-10-12 |access-date=2015-07-02 |publisher=INRIX.com}}
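At their core, such systems aggregate speed reports from drivers' devices by road segment. The toy sketch below illustrates the idea; it is a simplified assumption about the general approach, not INRIX's actual pipeline.

<syntaxhighlight lang="python">
from collections import defaultdict

def segment_speeds(probes):
    """probes: iterable of (segment_id, speed_kmh) reports from drivers."""
    sums = defaultdict(lambda: [0.0, 0])
    for segment, speed in probes:
        sums[segment][0] += speed
        sums[segment][1] += 1
    # Mean reported speed per segment estimates current conditions.
    return {seg: total / n for seg, (total, n) in sums.items()}

# Hypothetical probe reports; segment IDs are placeholders.
probes = [("A1", 95), ("A1", 88), ("A1", 20), ("B7", 52)]
speeds = segment_speeds(probes)
print(speeds["A1"])                        # ~67.7 km/h: slow probes drag it down
travel_time_min = 5 / speeds["A1"] * 60    # a 5 km segment -> ~4.4 minutes
</syntaxhighlight>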
= In healthcare =
The use of crowdsourcing in medical and health research is increasing steadily. The process involves outsourcing tasks to, or gathering input from, large and diverse groups of people, often through digital platforms, to contribute to medical research, diagnostics, data analysis, promotion, and various healthcare-related initiatives. This approach provides a community-based method of improving medical services.
From funding individual medical cases and innovative devices to supporting research, community health initiatives, and crisis responses, crowdsourcing proves its versatile impact in addressing diverse healthcare challenges.{{Cite journal |last1=Wang |first1=Cheng |last2=Han |first2=Larry |last3=Stein |first3=Gabriella |last4=Day |first4=Suzanne |last5=Bien-Gund |first5=Cedric |last6=Mathews |first6=Allison |last7=Ong |first7=Jason J. |last8=Zhao |first8=Pei-Zhen |last9=Wei |first9=Shu-Fang |last10=Walker |first10=Jennifer |last11=Chou |first11=Roger |last12=Lee |first12=Amy |last13=Chen |first13=Angela |last14=Bayus |first14=Barry |last15=Tucker |first15=Joseph D. |date=2020-01-20 |title=Crowdsourcing in health and medical research: a systematic review |journal=Infectious Diseases of Poverty |volume=9 |issue=1 |pages=8 |doi=10.1186/s40249-020-0622-9 |doi-access=free |issn=2049-9957 |pmc=6971908 |pmid=31959234}}
In 2011, UNAIDS initiated the participatory online policy project CrowdOutAIDS to better engage young people in decision-making processes related to AIDS.{{Cite journal |last1=Hildebrand |first1=Mikaela |last2=Ahumada |first2=Claudia |last3=Watson |first3=Sharon |date=January 2013 |title=CrowdOutAIDS: crowdsourcing youth perspectives for action |url=https://www.tandfonline.com/doi/full/10.1016/S0968-8080%2813%2941687-7 |journal=Reproductive Health Matters |language=en |volume=21 |issue=41 |pages=57–68 |doi=10.1016/S0968-8080(13)41687-7 |pmid=23684188 |s2cid=31888826 |issn=0968-8080|url-access=subscription }} The project acquired data from 3,497 participants across seventy-nine countries through online and offline forums. The outcomes emphasized the importance of youth perspectives in shaping strategies to address AIDS effectively, providing valuable insight for future community-empowerment initiatives.
Another approach is to source the results of clinical algorithms from the collective input of participants.{{Cite journal |last1=Feng |first1=Steve |last2=Woo |first2=Min-jae |last3=Kim |first3=Hannah |last4=Kim |first4=Eunso |last5=Ki |first5=Sojung |last6=Shao |first6=Lei |last7=Ozcan |first7=Aydogan |editor-first1=David |editor-first2=Aydogan |editor-first3=David |editor-last1=Levitz |editor-last2=Ozcan |editor-last3=Erickson |date=2016-03-11 |title=A game-based crowdsourcing platform for rapidly training middle and high school students to perform biomedical image analysis |url=https://www.spiedigitallibrary.org/conference-proceedings-of-spie/9699/96990T/A-game-based-crowdsourcing-platform-for-rapidly-training-middle-and/10.1117/12.2212310.full |journal=Optics and Biophotonics in Low-Resource Settings II |publisher=SPIE |volume=9699 |pages=92–100 |doi=10.1117/12.2212310|bibcode=2016SPIE.9699E..0TF |s2cid=124343732 |url-access=subscription }} Researchers developed a game-based crowdsourcing platform to train individuals, especially middle and high school students in South Korea, to diagnose malaria-infected red blood cells. Using a statistical framework, the platform combined the diagnoses of experts with those of minimally trained individuals to create a gold-standard library. The objective was to swiftly teach people without prior training to achieve high diagnostic accuracy.
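The aggregation step can be illustrated in simplified form. The sketch below weights each crowd rater by accuracy on gold-standard items before taking a weighted vote; this illustrates the general approach rather than the study's exact statistical framework, and all names and labels are hypothetical.

<syntaxhighlight lang="python">
from collections import defaultdict

def rater_weights(gold, ratings):
    """gold: {slide: label}; ratings: {rater: {slide: label}}."""
    weights = {}
    for rater, answers in ratings.items():
        scored = [s for s in answers if s in gold]
        correct = sum(answers[s] == gold[s] for s in scored)
        # Accuracy in [0, 1]; raters with no gold overlap get a neutral 0.5.
        weights[rater] = correct / len(scored) if scored else 0.5
    return weights

def weighted_vote(slide, ratings, weights):
    """Pick the label whose supporters carry the most total weight."""
    tally = defaultdict(float)
    for rater, answers in ratings.items():
        if slide in answers:
            tally[answers[slide]] += weights[rater]
    return max(tally, key=tally.get)

gold = {"s1": "infected", "s2": "healthy"}       # expert-labeled slides
ratings = {
    "student_a": {"s1": "infected", "s2": "healthy", "s3": "infected"},
    "student_b": {"s1": "healthy",  "s2": "healthy", "s3": "healthy"},
}
w = rater_weights(gold, ratings)            # student_a: 1.0, student_b: 0.5
print(weighted_vote("s3", ratings, w))      # -> "infected"
</syntaxhighlight>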
A systematic review in the journal Cancer Medicine examined studies on crowdsourcing in cancer research published between January 2005 and June 2016, searching PubMed, CINAHL, Scopus, PsychINFO, and Embase.{{Cite journal |last1=Lee |first1=Young Ji |last2=Arida |first2=Janet A. |last3=Donovan |first3=Heidi S. |date=November 2017 |title=The application of crowdsourcing approaches to cancer research: a systematic review |journal=Cancer Medicine |language=en |volume=6 |issue=11 |pages=2595–2605 |doi=10.1002/cam4.1165 |issn=2045-7634 |pmc=5673951 |pmid=28960834}} All of the studies advocated for continued efforts to refine and expand crowdsourcing applications in academic scholarship. The analysis highlighted the importance of interdisciplinary collaboration and the widespread dissemination of knowledge, and underscored the need to fully harness crowdsourcing's potential to address challenges within cancer research.
= In science =
== Astronomy ==
Crowdsourcing in astronomy was used in the early 19th century by astronomer Denison Olmsted. After being awakened late one November night by a meteor shower, Olmsted noticed a pattern in the shooting stars and wrote a brief report of the shower in the local newspaper. "As the cause of 'Falling Stars' is not understood by meteorologists, it is desirable to collect all the facts attending this phenomenon, stated with as much precision as possible", Olmsted wrote to readers, in a report subsequently picked up and reprinted by newspapers nationwide. Responses poured in from many states, along with scientists' observations sent to the American Journal of Science and Arts.{{cite web|last1=Vergano|first1=Dan|title=1833 Meteor Storm Started Citizen Science|url=http://newswatch.nationalgeographic.com/2014/08/30/1833-meteor-storm-started-citizen-science/|archive-url=https://web.archive.org/web/20140916020609/http://newswatch.nationalgeographic.com/2014/08/30/1833-meteor-storm-started-citizen-science/|url-status=dead|archive-date=16 September 2014|website=National Geographic|publisher=StarStruck|access-date=18 September 2014|date=2014-08-30}} These responses helped him make a series of scientific breakthroughs, including the observations that meteor showers are seen nationwide and that meteors fall from space under the influence of gravity. The responses also allowed him to approximate the velocity of the meteors.{{cite journal |last1=Littmann |first1=Mark |last2=Suomela |first2=Todd |title=Crowdsourcing, the great meteor storm of 1833, and the founding of meteor science |journal=Endeavour |date=June 2014 |volume=38 |issue=2 |pages=130–138 |doi=10.1016/j.endeavour.2014.03.002 |pmid=24917173 }}
A more recent version of crowdsourcing in astronomy is NASA's photo organizing project,{{cite web|url=http://eol.jsc.nasa.gov/|title=Gateway to Astronaut Photography of Earth|publisher=NASA}} which asked internet users to browse photos taken from space and try to identify the location the picture is documenting.{{cite news|last1=McLaughlin|first1=Elliot|title=Image Overload: Help us sort it all out, NASA requests|url=http://www.cnn.com/2014/08/17/tech/nasa-earth-images-help-needed/ |publisher=CNN|access-date=18 September 2014}}
== Behavioral science ==
In the field of behavioral science, crowdsourcing is often used to gather data and insights on human behavior and decision making. Researchers may create online surveys or experiments that are completed by a large number of participants, allowing them to collect a diverse and potentially large amount of data. Crowdsourcing can also be used to gather real-time data on behavior, such as through the use of mobile apps that track and record users' activities and decision making.{{Cite journal |last1=Liu |first1=Huiying |last2=Xie |first2=Qian Wen |last3=Lou |first3=Vivian W. Q. |date=2019-04-01 |title=Everyday social interactions and intra-individual variability in affect: A systematic review and meta-analysis of ecological momentary assessment studies |journal=Motivation and Emotion |volume=43 |issue=2 |pages=339–353 |doi=10.1007/s11031-018-9735-x |s2cid=254827087 }} The use of crowdsourcing in behavioral science has the potential to greatly increase the scope and efficiency of research, and has been used in studies on topics such as psychology research,{{Cite journal |last1=Luong |first1=Raymond |last2=Lomanowska |first2=Anna M. |date=2021 |title=Evaluating Reddit as a Crowdsourcing Platform for Psychology Research Projects |journal=Teaching of Psychology |volume=49 |issue=4 |pages=329–337 |doi=10.1177/00986283211020739 |s2cid=236414676 |doi-access=free }} political attitudes,{{Cite journal |last1=Brown |first1=Joshua K. |last2=Hohman |first2=Zachary P. |date=2022 |title=Extreme party animals: Effects of political identification and ideological extremity |journal=Journal of Applied Social Psychology |volume=52 |issue=5 |pages=351–362 |doi=10.1111/jasp.12863 |s2cid=247077069 }} and social media use.{{Cite journal |last1=Vaterlaus |first1=J. Mitchell |last2=Patten |first2=Emily V. |last3=Spruance |first3=Lori A. |date=2022-05-26 |title=#Alonetogether:: An Exploratory Study of Social Media Use at the Beginning of the COVID-19 Pandemic |url=https://thejsms.org/index.php/JSMS/article/view/887 |journal=The Journal of Social Media in Society |volume=11 |issue=1 |pages=27–45}}
== Energy system research ==
Energy system models require large and diverse datasets, increasingly so given the trend towards greater temporal and spatial resolution.
{{cite journal
| last1 = Després | first1 = Jacques
| last2 = Hadjsaid | first2 = Nouredine
| last3 = Criqui | first3 = Patrick
| last4 = Noirot | first4 = Isabelle
| title = Modelling the impacts of variable renewable sources on the power sector: reconsidering the typology of energy modelling tools
| date = 1 February 2015
| journal = Energy
| volume = 80
| pages = 486–495
| doi = 10.1016/j.energy.2014.12.005
| bibcode = 2015Ene....80..486D
}}
In response, there have been several initiatives to crowdsource this data. Launched in December 2009, OpenEI is a collaborative website run by the US government that provides open energy data.
{{cite web
| title = OpenEI — Energy Information, Data, and other Resources
| website = OpenEI
| url = http://en.openei.org
| access-date = 2016-09-26
}}
{{cite web
| first = Peggy
| last = Garvin
| title = New Gateway: Open Energy Info
| work = SLA Government Information Division
| location = Dayton, Ohio, USA
| access-date = 2016-09-26
| date = 12 December 2009
| url = http://govinfo.sla.org/2009/12/12/new-gateway-open-energy-info/
}}{{Dead link|date=November 2019 |bot=InternetArchiveBot |fix-attempted=yes }} While much of its information is from US government sources, the platform also seeks crowdsourced input from around the world.{{cite book
| last = Brodt-Giles
| first = Debbie
| title = WREF 2012: OpenEI — an open energy data and information exchange for international audiences
| date = 2012
| publisher = National Renewable Energy Laboratory (NREL)
| location = Golden, Colorado, USA
| url = https://ases.conference-services.net/resources/252/2859/pdf/SOLAR2012_0677_full%20paper.pdf
| archive-url = https://web.archive.org/web/20161009172347/https://ases.conference-services.net/resources/252/2859/pdf/SOLAR2012_0677_full%20paper.pdf
| url-status = dead
| archive-date = 9 October 2016
| access-date = 2016-09-24
}} The semantic wiki and database Enipedia also publishes energy systems data using the concept of crowdsourced open information. Enipedia went live in March 2011.{{cite web
| first1 = Chris
| last1 = Davis
| first2 = Alfredas
| last2 = Chmieliauskas
| first3 = Gerard
| last3 = Dijkema
| first4 = Igor
| last4 = Nikolic
| title = Enipedia
| publisher = Energy and Industry group, Faculty of Technology, Policy and Management, TU Delft
| location = Delft, The Netherlands
| url = http://enipedia.tudelft.nl
| archive-url = https://archive.today/20140610231532/http://enipedia.tudelft.nl/
| url-status = usurped
| archive-date = 2014-06-10
| access-date = 2016-10-07
}}
== Genealogy research ==
Genealogical research used crowdsourcing techniques long before personal computers were common. Beginning in 1942, the Church of Jesus Christ of Latter-day Saints encouraged its members to submit information about their ancestors. The submitted information was gathered into a single collection. In 1969, to encourage more participation, the church started the three-generation program, in which church members were asked to prepare documented family group record forms for the first three generations. The program was later expanded to encourage members to research at least four generations and became known as the four-generation program.{{cite web|url=https://www.churchofjesuschrist.org/study/ensign/1972/03/what-is-the-four-generation-program?lang=eng |title=What Is the Four-Generation Program? |publisher=The Church of Jesus Christ of Latter-day Saints | access-date=30 January 2012}}
Institutes that have records of interest to genealogical research have used crowds of volunteers to create catalogs and indices to records.{{Citation needed|date=September 2022}}
== Genetic genealogy research ==
Genetic genealogy combines traditional genealogy with genetics. The rise of personal DNA testing after the turn of the century, by companies such as Gene by Gene, FTDNA, GeneTree, 23andMe, and Ancestry.com, has led to public and semi-public databases of DNA test results built using crowdsourcing techniques. Citizen science projects have included support, organization, and dissemination of personal DNA (genetic) testing. As in amateur astronomy, citizen scientists encouraged by volunteer organizations like the International Society of Genetic Genealogy{{cite journal|last1=King|doi=10.1016/j.tig.2009.06.003|quote=The International Society of Genetic Genealogy advocates the use of genetics as a tool for genealogical research, and provides a support network for genetic genealogists. It hosts the ISOGG Y-haplogroup tree, which has the virtue of being regularly updated.|title=What's in a name? Y chromosomes, surnames and the genetic genealogy revolution|year=2009|first1=Turi E.|last2=Jobling|first2=Mark A.|journal=Trends in Genetics|volume=25|issue=8|pages=351–60|pmid=19665817|hdl=2381/8106|url=https://figshare.com/articles/journal_contribution/10096019|hdl-access=free}} have provided valuable information and research to the professional scientific community.{{cite journal |last1=Mendez |first1=Fernando L. |display-authors=etal |title=An African American Paternal Lineage Adds an Extremely Ancient Root to the Human Y Chromosome Phylogenetic Tree |date=28 February 2013 |doi=10.1016/j.ajhg.2013.02.002 |volume=92 |issue=3 |journal=The American Journal of Human Genetics |pages=454–459 |pmid=23453668 |pmc=3591855}} The Genographic Project, which began in 2005, is a research project carried out by the National Geographic Society's scientific team to reveal patterns of human migration using crowdsourced DNA testing and reporting of results.{{cite web |last=Wells |first=Spencer |title=The Genographic Project and the Rise of Citizen Science |publisher=Southern California Genealogical Society (SCGS) |year=2013 |url=http://www.scgsgenealogy.com/Jamboree/2013/DNAday.htm |archive-url=https://web.archive.org/web/20130710014353/http://www.scgsgenealogy.com/Jamboree/2013/DNAday.htm |archive-date=2013-07-10 |access-date=July 10, 2013}}
== Ornithology ==
Another early example of crowdsourcing occurred in the field of ornithology. On 25 December 1900, Frank Chapman, an early officer of the National Audubon Society, initiated a tradition dubbed the "Christmas Day Bird Census". The project called on birders from across North America to count and record the number of birds of each species they witnessed on Christmas Day. The project was successful, and the records from 27 contributors were compiled into one bird census, which tallied around 90 species of birds.{{cite web|date=2015-01-22|title=History of the Christmas Bird Count | Audubon|url=http://birds.audubon.org/history-christmas-bird-count|access-date=2015-07-02|publisher=Birds.audubon.org}} This large-scale collection of data constituted an early form of citizen science, the premise upon which crowdsourcing is based. In the 2012 census, more than 70,000 individuals participated across 2,369 bird count circles.{{Cite web|date=5 October 2017|title=Thank you!|url=https://www.audubon.org/thank-you-0|url-status=dead|archive-url=https://web.archive.org/web/20140824051327/http://www.audubon.org/thank-you-0|archive-date=24 August 2014|website=Audubon}} Christmas 2014 marked the National Audubon Society's 115th annual Christmas Bird Count.
== Seismology ==
The European-Mediterranean Seismological Centre (EMSC) has developed a seismic detection system by monitoring the traffic peaks on its website and analyzing keywords used on Twitter.{{cite web|title=Home – ISCRAM2015 – University of Agder|url=http://iscram2015.uia.no/wp-content/uploads/2015/05/8-9.pdf|url-status=dead|archive-url=https://web.archive.org/web/20161017122308/http://iscram2015.uia.no/wp-content/uploads/2015/05/8-9.pdf|archive-date=2016-10-17|access-date=2016-10-14|website=iscram2015.uia.no}}
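The idea behind such detection can be sketched as a simple anomaly detector over per-minute visit counts: flag a likely felt earthquake when traffic jumps far above a rolling baseline. This is an illustrative assumption about the general approach, not EMSC's actual implementation.

<syntaxhighlight lang="python">
from collections import deque
from statistics import mean, stdev

def spike_detector(hits_per_minute, window=30, threshold=4.0):
    """Yield indices of minutes whose traffic is far above the rolling baseline."""
    baseline = deque(maxlen=window)
    for i, hits in enumerate(hits_per_minute):
        if len(baseline) == window:
            mu = mean(baseline)
            sigma = max(stdev(baseline), 1.0)  # floor avoids division by zero
            if (hits - mu) / sigma > threshold:
                yield i   # likely felt event: visitors flock to the site
        baseline.append(hits)

# Synthetic feed: quiet traffic, then a sudden surge after a tremor.
traffic = [40] * 60 + [400, 350, 350, 350, 350, 350] + [60] * 10
print(list(spike_detector(traffic)))   # -> [60, 61]
</syntaxhighlight>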
= In journalism =
{{See also|Collaborative journalism|Citizen journalism}}
Crowdsourcing is increasingly used in professional journalism. Journalists organize crowdsourced information by fact-checking it, and then use what they have gathered in their articles as they see fit.{{Citation needed|date=September 2022}} A daily newspaper in Sweden successfully used crowdsourcing to investigate home loan interest rates in the country in 2013–2014, which resulted in over 50,000 submissions.{{Cite journal|last=Aitamurto|first=Tanja|year=2015|title=Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change and Peer-Learning|url=http://crowdsourcinginjournalism.com/2015/10/28/motivation-factors-in-crowdsourced-journalism-social-impact-social-change-and-peer-learning/|journal=International Journal of Communication|volume=9|pages=3523–3543}} A daily newspaper in Finland crowdsourced an investigation into stock short-selling in 2011–2012, and the crowdsourced information led to the revelation of a tax evasion scheme at a Finnish bank. The bank's executive was fired and policy changes followed.{{cite journal|year=2016|title=Crowdsourcing as a Knowledge-Search Method in Digital Journalism: Ruptured Ideals and Blended Responsibility|url=http://crowdsourcinginjournalism.com/2015/07/04/crowdsourcing-as-a-knowledge-search-method-in-digital-journalism-ruptured-ideals-and-blended-responsibility/|journal=Digital Journalism|volume=4|issue=2|pages=280–297|doi=10.1080/21670811.2015.1034807|last1=Aitamurto|first1=Tanja|s2cid=156243124|url-access=subscription}} TalkingPointsMemo in the United States asked its readers to examine 3,000 emails concerning the firing of federal prosecutors in 2008. The British newspaper The Guardian crowdsourced the examination of hundreds of thousands of documents in 2009.{{cite journal|last1=Aitamurto|first1=Tanja|title=Balancing between open and closed: co-creation in magazine journalism|journal=Digital Journalism|volume=1|issue=2|doi=10.1080/21670811.2012.750150|pages=229–251|year=2013|s2cid=62882093}}
== Data donation ==
Data donation is a crowdsourcing approach to gathering digital data. It is used by researchers and organizations to gain access to data from online platforms, websites, search engines, apps, and devices. Data donation projects usually rely on participants volunteering their authentic digital profile information. Examples include:
- DataSkop, developed by Algorithm Watch, a non-profit research organization in Germany, which accessed data on social media algorithms and automated decision-making systems.{{Cite web |date=2022 |title=Algorithm Watch |url=https://algorithmwatch.org/en/ |access-date=18 May 2022 |website=Algorithm Watch}}{{Cite web |date=2022 |title=Overview in English |url=https://dataskop.net/overview-in-english/ |access-date=2022-05-18 |website=DataSkop }}
- Mozilla Rally, from the Mozilla Foundation, is a browser extension for adult participants in the US{{cite web | title=FAQs | website=Mozilla Rally | url=https://rally.mozilla.org/how-rally-works/faqs/ | access-date=14 March 2023 | quote=Mozilla Rally is currently available to US residents who are age 19 and older | archive-date=14 March 2023 | archive-url=https://web.archive.org/web/20230314142545/https://rally.mozilla.org/how-rally-works/faqs/ | url-status=dead }} to provide access to their data for research projects.{{Cite web |title=It's your data. Use it for change|url=https://rally.mozilla.org/ |access-date=2023-03-14 |website=Mozilla Rally }}
- The Australian Search Experience and Ad Observatory projects, set up in 2021 by researchers at the ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S) in Australia, used data donations to analyze how Google personalizes search results and how Facebook's algorithmic advertising model works.{{Cite web |last=Angus |first=Daniel |date=2022-02-16 |title=A data economy: the case for doing and knowing more about algorithms |url=https://www.crikey.com.au/2022/02/16/data-economy-algorithms/ |access-date=2022-03-24 |website=Crikey }}{{Cite journal |last1=Burgess |first1=Jean |last2=Angus |first2=Daniel |last3=Carah |first3=Nicholas |last4=Andrejevic |first4=Mark |last5=Hawker |first5=Kiah |last6=Lewis |first6=Kelly |last7=Obeid |first7=Abdul |last8=Smith |first8=Adam |last9=Tan |first9=Jane |last10=Fordyce |first10=Robbie |last11=Trott |first11=Verity |date=2021-11-08 |title=Critical simulation as hybrid digital method for exploring the data operations and vernacular cultures of visual social media platforms |url=https://eprints.qut.edu.au/226345/ |journal=SocArXiv |doi=10.31235/osf.io/2cwsu|s2cid=243837581 }}
- The Citizen Browser Project, developed by The Markup, was designed to measure how disinformation traveled across social media platforms over time.{{Cite web |last=The Markup |date=2022 |title=The Citizen Browser Project—Auditing the Algorithms of Disinformation |url=https://themarkup.org/citizen-browser |access-date=2022-05-18 |website=The Markup }}
- The Large Emergency Event Digital Information Repository, an effort to create a repository for images and videos of natural disasters, terrorist attacks, and criminal events.
= In social media =
Crowdsourcing is used on large-scale social media platforms, for example in the community notes system of the X platform. Crowdsourcing on such platforms is thought to be effective in combating partisan misinformation on social media when certain conditions are met.{{Cite journal |last1=Pretus |first1=Clara |last2=Gil-Buitrago |first2=Helena |last3=Cisma |first3=Irene |last4=Hendricks |first4=Rosamunde C. |last5=Lizarazo-Villarreal |first5=Daniela |date=2024-07-16 |title=Scaling crowdsourcing interventions to combat partisan misinformation |url=https://advances.in/psychology/10.56296/aip00018/ |journal=Advances.in/Psychology |language=en |volume=2 |pages=e85592 |doi=10.56296/aip00018 |issn=2976-937X}}{{Cite journal |last1=Allen |first1=Jennifer |last2=Arechar |first2=Antonio A. |last3=Pennycook |first3=Gordon |last4=Rand |first4=David G. |date=2021-09-03 |title=Scaling up fact-checking using the wisdom of crowds |journal=Science Advances |language=en |volume=7 |issue=36 |pages=eabf4393 |doi=10.1126/sciadv.abf4393 |issn=2375-2548 |pmc=8442902 |pmid=34516925|bibcode=2021SciA....7.4393A }} Success may depend on trust in fact-checking sources, the ability to present information that challenges previous beliefs without causing excessive dissonance, and having a sufficiently large and diverse crowd of participants. Effective crowdsourcing interventions must navigate politically polarized environments in which trusted sources may be reluctant to offer dissonant opinions. By leveraging network analysis to connect users with neighboring communities outside their ideological echo chambers, crowdsourcing can provide an additional layer of content moderation.
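The bridging idea can be illustrated in simplified form: a note is surfaced only if raters from different ideological clusters find it helpful, so one-sided popularity is not enough. The sketch below is illustrative only; X's actual Community Notes ranking is a more elaborate matrix-factorization model, and the rater names, clusters, and threshold here are hypothetical.

<syntaxhighlight lang="python">
def bridging_score(ratings, clusters):
    """ratings: {rater: 1 if helpful, 0 if not}; clusters: {rater: cluster id}."""
    by_cluster = {}
    for rater, helpful in ratings.items():
        by_cluster.setdefault(clusters[rater], []).append(helpful)
    # Score = worst per-cluster helpfulness rate: every cluster must agree.
    return min(sum(v) / len(v) for v in by_cluster.values())

ratings  = {"a": 1, "b": 1, "c": 1, "d": 0, "e": 1}
clusters = {"a": "left", "b": "left", "c": "left", "d": "right", "e": "right"}

score = bridging_score(ratings, clusters)
print(score)                    # 0.5: strong support on one side only
print(score >= 0.8)             # False -> note not shown under a 0.8 threshold
</syntaxhighlight>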
= In public policy =
Crowdsourcing public policy and the production of public services is also referred to as citizen sourcing. While some scholars regard crowdsourcing for this purpose as a policy tool{{Cite journal|last1=Smith|first1=Graham|last2=Richards|first2=Robert C.|last3=Gastil|first3=John|date=2015-05-12|title=The Potential of Participedia as a Crowdsourcing Tool for Comparative Analysis of Democratic Innovations|journal=Policy & Internet|volume=7|issue=2|pages=243–262|doi=10.1002/poi3.93|url=http://westminsterresearch.wmin.ac.uk/15138/1/Participedia%20PSA%20Version.pdf}} or a definite means of co-production,{{Cite journal|last=Moon|first=M. Jae|title=Evolution of co-production in the information age: crowdsourcing as a model of web-based co-production in Korea|journal=Policy and Society|volume=37|issue=3|pages=294–309|doi=10.1080/14494035.2017.1376475|year=2018|s2cid=158440300|doi-access=free}} others question this and argue that crowdsourcing should be considered only a technological enabler that increases the speed and ease of participation.{{Cite journal|last=Taeihagh|first=Araz|date=2017-11-08|title=Crowdsourcing: a new tool for policy-making?|journal=Policy Sciences|volume=50|issue=4|pages=629–647|doi=10.1007/s11077-017-9303-3|arxiv=1802.03113|s2cid=27696037}} Crowdsourcing can also play a role in democratization.
{{cite book|last1=Diamond|first1=Larry|url=https://books.google.com/books?id=0IN8DwAAQBAJ|title=Democratization|last2=Whittington|first2=Zak|publisher=Oxford University Press|year=2009|isbn=9780198732280|editor1-last=Welzel|editor1-first=Christian|editor1-link=Christian Welzel|edition=2|location=Oxford|publication-date=2018|page=256|chapter=Social Media|quote=Another way that social media can contribute to democratization is by 'crowdsourcing' information. This elicits the knowledge and wisdom of the 'crowd' [...].|author-link1=Larry Diamond|access-date=4 March 2021|editor2-last=Haerpfer|editor2-first=Christian W.|editor3-last=Bernhagen|editor3-first=Patrick|editor4-last=Inglehart|editor4-first=Ronald F.|editor4-link=Ronald Inglehart}}
The first conference focusing on Crowdsourcing for Politics and Policy took place at Oxford University, under the auspices of the Oxford Internet Institute in 2014. Research has emerged since 2012{{Cite book|title=Crowdsourcing for Democracy: New Era In Policy–Making.|last=Aitamurto|first=Tanja|publisher=Committee for the Future, Parliament of Finland|year=2012|isbn=978-951-53-3459-6|pages=10–30}} which focused on the use of crowdsourcing for policy purposes.{{cite web|author1=Prpić, J.|author2=Taeihagh, A.|author3=Melton, J.|date=2014|title=Crowdsourcing the Policy Cycle. Collective Intelligence 2014, MIT Center for Collective Intelligence|url=http://humancomputation.com/ci2014/papers/Active%20Papers%5CPaper%2040.pdf|archive-url=https://web.archive.org/web/20150624044131/http://humancomputation.com/ci2014/papers/Active%20Papers%5CPaper%2040.pdf|url-status=dead|archive-date=2015-06-24|publisher=Humancomputation.com|access-date=2015-07-02}}{{cite web|url=https://www.researchgate.net/publication/262523774|title=A Framework for Policy Crowdsourcing. Oxford Internet Institute, University of Oxford – IPP 2014 – Crowdsourcing for Politics and Policy|author1=Prpić, J.|author2=Taeihagh, A.|date=2014|publisher=Ipp.oxii.ox.ac.uk|format=PDF|access-date=2018-10-02|author3=Melton, J.}} These include experimentally investigating the use of Virtual Labor Markets for policy assessment,{{cite web|author1=Prpić, J.|author2=Taeihagh, A.|author3=Melton, J.|date=2014|title=Experiments on Crowdsourcing Policy Assessment. Oxford Internet Institute, University of Oxford – IPP 2014 – Crowdsourcing for Politics and Policy|url=http://ipp.oii.ox.ac.uk/sites/ipp/files/documents/IPP2014_Taeihagh.pdf|publisher=Ipp.oii.ox.ac.uk|access-date=2015-07-02|archive-date=24 June 2015|archive-url=https://web.archive.org/web/20150624041608/http://ipp.oii.ox.ac.uk/sites/ipp/files/documents/IPP2014_Taeihagh.pdf|url-status=dead}} and assessing the potential for citizen involvement in process innovation for public administration.{{Cite journal|author1=Thapa, B.|author2=Niehaves, B.|author3=Seidel, C.|author4=Plattfaut, R.|title=Citizen involvement in public sector innovation: Government and citizen perspectives | journal = Information Polity | volume = 20 | issue = 1 | pages = 3–17 | date = 2015 | url = http://content.iospress.com/articles/information-polity/ip351 | doi=10.3233/IP-150351| url-access=subscription }}
Governments across the world increasingly use crowdsourcing for knowledge discovery and civic engagement.{{Citation needed|date=September 2022}} Iceland crowdsourced its constitutional reform process in 2011, and Finland has crowdsourced several law reform processes, including one addressing its off-road traffic law. The Finnish government allowed citizens to discuss problems and possible resolutions regarding the off-road traffic law on an online forum.{{Citation needed|date=September 2022}} The crowdsourced information and resolutions were then passed on to legislators to refer to when making decisions, allowing citizens to contribute to public policy in a more direct manner.{{cite journal|last1=Aitamurto and Landemore|title=Five design principles for crowdsourced policymaking: Assessing the case of crowdsourced off-road traffic law reform in Finland|journal=Journal of Social Media for Organizations|issue=1|pages=1–19|url=http://thefinnishexperiment.com/2015/02/04/design-for-crowdsourced-policy-making/|date=2015-02-04}}{{Cite journal| last1=Aitamurto | first1=Tanja | last2=Landemore | first2=Hélène | last3=Saldivar Galli | first3=Jorge|year=2016|title=Unmasking the Crowd: Participants' Motivation Factors, Profile and Expectations for Participation in Crowdsourced Policymaking|url=http://thefinnishexperiment.com/2016/09/21/motivation-factors-for-participation-in-crowdsourced-policymaking/|journal=Information, Communication & Society|volume=20|issue=8|pages=1239–1260|doi=10.1080/1369118x.2016.1228993|s2cid=151989757|url-access=subscription}} Palo Alto has crowdsourced feedback for its Comprehensive City Plan update in a process started in 2015.{{Cite book|year=2016|title= Proceedings of the 20th International Academic Mindtrek Conference|chapter= Civic CrowdAnalytics: Making sense of crowdsourced civic input with big data tools|chapter-url=http://thefinnishexperiment.com/2016/10/23/making-sense-of-crowdsourced-civic-input-with-big-data-tools/|via=ACM Digital Archive|doi=10.1145/2994310.2994366|s2cid=16855773|chapter-url-access=subscription|last1= Aitamurto|first1= Tanja|last2= Chen|first2= Kaiping|last3= Cherif|first3= Ahmed|last4= Galli|first4= Jorge Saldivar|last5= Santana|first5= Luis|pages= 86–94|isbn= 978-1-4503-4367-1}} The House of Representatives in Brazil has used crowdsourcing in policy reforms.
NASA used crowdsourcing to analyze large sets of images. As part of the Open Government Initiative of the Obama Administration, the General Services Administration collected and amalgamated suggestions for improving federal websites.{{cite book|last1=Aitamurto|first1=Tanja|title=Crowdsourcing for Democracy: New Era in Policymaking|publisher=Committee for the Future, Parliament of Finland |isbn=978-951-53-3459-6 |url=http://thefinnishexperiment.com/2015/01/31/crowdsourcing-for-democracy-new-era-in-policy-making/ |date=2015-01-31}}
For part of the Obama and Trump Administrations, the We the People system collected signatures on petitions, which were entitled to an official response from the White House once a certain number had been reached. Several U.S. federal agencies ran inducement prize contests, including NASA and the Environmental Protection Agency.{{cite web |url=http://challenge.gov/ |title=Home |website=challenge.gov}}
= In product design =
Organizations often leverage crowdsourcing to gather ideas for new products as well as to refine established products. Lego allows users to submit new product designs while conducting requirements testing. Any user can provide a design for a product, and other users can vote on it. Once a submitted design has received 10,000 votes, it is formally reviewed in stages and, if no impediments such as legal flaws are identified, goes into production. The creator receives royalties from the net income.{{Citation |last1=Martin |first1=Fred |title=Lego/Logo and Electronic Bricks: Creating a Scienceland for Children |date=1993 |work=Advanced Educational Technologies for Mathematics and Science |pages=61–89 |place=Berlin, Heidelberg |publisher=Springer Berlin Heidelberg |isbn=978-3-642-08152-1 |last2=Resnick |first2=Mitchel|doi=10.1007/978-3-662-02938-1_2 }} Labelling new products as "customer-ideated" through crowdsourcing initiatives, as opposed to not specifying the source of design, leads to a substantial increase in the actual market performance of the products. Merely highlighting the source of design to customers, in particular attributing the product to crowdsourcing efforts from user communities, can lead to a significant boost in product sales. Consumers perceive "customer-ideated" products as more effective in addressing their needs, leading to a quality inference. The design mode associated with crowdsourced ideas is considered superior in generating promising new products, contributing to the observed increase in market performance.{{Cite journal |last1=Nishikawa |first1=Hidehiko |last2=Schreier |first2=Martin |last3=Fuchs |first3=Christoph |last4=Ogawa |first4=Susumu |date=August 2017 |title=The Value of Marketing Crowdsourced New Products as Such: Evidence from Two Randomized Field Experiments |url=http://journals.sagepub.com/doi/10.1509/jmr.15.0244 |journal=Journal of Marketing Research |language=en |volume=54 |issue=4 |pages=525–539 |doi=10.1509/jmr.15.0244 |issn=0022-2437|url-access=subscription }}
= In business =
Crowdsourcing is widely used by businesses to source feedback and suggestions on how to improve their products and services. Homeowners can use Airbnb to list their accommodation or unused rooms, setting their own nightly, weekly, and monthly rates. The business, in turn, charges both guests and hosts a fee: guests pay a booking fee, typically between 9% and 15% of the booking subtotal, every time they reserve a room, and hosts pay a service fee on the amount due.{{Citation |last1=Reinhold |first1=Stephan |title=How Airbnb Creates Value |date=December 2017 |work=Peer-to-Peer Accommodation Networks |publisher=Goodfellow Publishers |last2=Dolnicar |first2=Sara|doi=10.23912/9781911396512-3602 |isbn=9781911396512 }} The company has listed around 1.5 million properties in 34,000 cities in more than 190 countries.{{Citation needed|date=September 2022}}
= In market research =
Crowdsourcing is frequently used in market research as a way to gather insights and opinions from a large number of consumers.{{Cite web |title=Prime Panels by CloudResearch {{!}} Online Research Panel Recruitment |url=https://www.cloudresearch.com/products/prime-panels/ |access-date=2023-01-12 |website=CloudResearch }} Companies may create online surveys or focus groups that are open to the general public, allowing them to gather a diverse range of perspectives on their products or services. This can be especially useful for companies seeking to understand the needs and preferences of a particular market segment or to gather feedback on the effectiveness of their marketing efforts. The use of crowdsourcing in market research allows companies to quickly and efficiently gather a large amount of data and insights that can inform their business decisions.{{Cite book |author1=Nunan, Daniel |url=https://www.worldcat.org/oclc/1128061550 |title=Marketing research : applied insight |date=2020 |author2=Birks, David F. |author3=Malhotra, Naresh K. |isbn=978-1-292-30872-2 |edition=6th |location=Harlow, United Kingdom |oclc=1128061550|publisher=Pearson}}
= Other examples =
- Geography — Volunteered geographic information (VGI) is geographic information generated through crowdsourcing, as opposed to traditional methods of Professional Geographic Information (PGI).{{Cite journal|last1=Parker|first1=Christopher J.|last2=May|first2=Andrew|last3=Mitchell|first3=Val|date=November 2013|title=The role of VGI and PGI in supporting outdoor activities|url=https://dspace.lboro.ac.uk/2134/10350|journal=Applied Ergonomics|volume=44|issue=6|pages=886–894|doi=10.1016/j.apergo.2012.04.013|pmid=22795180|s2cid=12918341 }} In describing the built environment, VGI has many advantages over PGI, primarily perceived currency,{{Cite journal|last1=Parker|first1=Christopher J.|last2=May|first2=Andrew|last3=Mitchell|first3=Val|date=2014-05-15|title=User-centred design of neogeography: the impact of volunteered geographic information on users' perceptions of online map 'mashups'|url=https://dspace.lboro.ac.uk/2134/23845|journal=Ergonomics|volume=57|issue=7|pages=987–997|doi=10.1080/00140139.2014.909950|pmid=24827070|s2cid=13458260}} accuracy{{Cite journal|last1=Brown|first1=Michael|last2=Sharples|first2=Sarah|last3=Harding|first3=Jenny|last4=Parker|first4=Christopher J.|date=November 2013|title=Usability of Geographic Information: Current challenges and future directions|url=http://eprints.nottingham.ac.uk/2809/1/Brown_et_al_2013_Usabilty_of_Geographic_Information.pdf|journal=Applied Ergonomics|volume=44|issue=6|pages=855–865|doi=10.1016/j.apergo.2012.10.013|pmid=23177775|s2cid=26412254 |access-date=20 August 2019|archive-date=19 July 2018|archive-url=https://web.archive.org/web/20180719082903/http://eprints.nottingham.ac.uk/2809/1/Brown_et_al_2013_Usabilty_of_Geographic_Information.pdf|url-status=dead}} and authority.{{Cite journal|last1=Parker|first1=Christopher J.|last2=May|first2=Andrew|last3=Mitchell|first3=Val|date=August 2012|title=Understanding Design with VGI using an Information Relevance Framework|url=https://dspace.lboro.ac.uk/2134/10349|journal=Transactions in GIS|volume=16|issue=4|pages=545–560|doi=10.1111/j.1467-9671.2012.01302.x|bibcode=2012TrGIS..16..545P |s2cid=20100267}} OpenStreetMap is an example of a crowdsourced mapping project.
- Engineering — Many companies use crowdsourcing to grow their engineering capabilities, find solutions to unsolved technical challenges, and adopt new technologies such as 3D printing and the Internet of Things (IoT).{{Citation needed|date=September 2022}}
- Libraries, museums and archives — Newspaper text correction at the National Library of Australia was an early, influential example of work with text transcriptions for crowdsourcing in cultural heritage institutions.{{cite journal|last1=Holley|first1=Rose|date=March 2010|title=Crowdsourcing: How and Why Should Libraries Do It?|url=http://www.dlib.org/dlib/march10/holley/03holley.html|journal=D-Lib Magazine|volume=16|issue=3/4|doi=10.1045/march2010-holley|access-date=21 May 2021|doi-access=free}} The Steve Museum project provided a prototype for categorizing artworks.{{cite book|last1=Trant|first1=Jennifer|url=http://conference.archimuse.com/files/trantSteveResearchReport2008.pdf|title=Tagging, Folksonomy and Art Museums: Results of steve.museum's research|date=2009|publisher=Archives & Museum Informatics|access-date=21 May 2021|archive-url=https://web.archive.org/web/20100210192354/http://conference.archimuse.com/files/trantSteveResearchReport2008.pdf|archive-date=2010-02-10|url-status=dead}} Crowdsourcing is used in libraries for OCR corrections on digitized texts, for tagging, and for funding, especially in the absence of financial and human means. Volunteers can contribute explicitly, through conscious effort, or implicitly, without necessarily knowing it, by turning the text on raw newspaper images into human-corrected digital form.Andro, M. (2018). Digital libraries and crowdsourcing, Wiley / ISTE. {{ISBN|9781786301611}}.
- Agriculture — Crowdsourced research also applies to the field of agriculture. Crowdsourcing can be used to help farmers and experts identify different types of weeds{{Citation |last1=Rahman |first1=Mahbubur |title=Smartphone-based hierarchical crowdsourcing for weed identification |url=http://dl.acm.org/citation.cfm?id=2784520 |journal=Computers and Electronics in Agriculture |volume=113 |pages=14–23 |year=2015 |doi=10.1016/j.compag.2014.12.012 |access-date=12 August 2015 |last2=Blackwell |first2=Brenna |last3=Banerjee |first3=Nilanjan |last4=Dharmendra |first4=Saraswat|bibcode=2015CEAgr.113...14R |url-access=subscription }} in the fields and to provide assistance in removing them.
- Cheating in bridge — Boye Brogeland initiated a crowdsourcing investigation of cheating by top-level bridge players that showed several players as guilty, which led to their suspension.{{cite web| title=2015 Cheating Scandal| url=https://bridgewinners.com/article/series/2015-cheating-scandal/| publisher=Bridge Winners| date=2015| access-date=20 January 2024}}
- Software development — Open-source software and crowdsourced software development have been used extensively in this domain.
- Healthcare — Research has emerged outlining the use of crowdsourcing techniques in the public health domain.{{Cite journal |last1=Tang |first1=Weiming |last2=Han |first2=Larry |last3=Best |first3=John |last4=Zhang |first4=Ye |last5=Mollan |first5=Katie |last6=Kim |first6=Julie |last7=Liu |first7=Fengying |last8=Hudgens |first8=Michael |last9=Bayus |first9=Barry |date=2016-06-01 |title=Crowdsourcing HIV Test Promotion Videos: A Noninferiority Randomized Controlled Trial in China |journal=Clinical Infectious Diseases |volume=62 |issue=11 |pages=1436–1442 |doi=10.1093/cid/ciw171 |pmc=4872295 |pmid=27129465}}{{Cite journal |last1=Zhang |first1=Ye |last2=Kim |first2=Julie A. |last3=Liu |first3=Fengying |last4=Tso |first4=Lai Sze |last5=Tang |first5=Weiming |last6=Wei |first6=Chongyi |last7=Bayus |first7=Barry L. |last8=Tucker |first8=Joseph D. |date=November 2015 |title=Creative Contributory Contests to Spur Innovation in Sexual Health: 2 Cases and a Guide for Implementation |journal=Sexually Transmitted Diseases |volume=42 |issue=11 |pages=625–628 |doi=10.1097/OLQ.0000000000000349 |pmc=4610177 |pmid=26462186}}{{Cite journal |last=Créquit |first=Perrine |date=2018 |title=Mapping of Crowdsourcing in Health: Systematic Review |journal=Journal of Medical Internet Research |volume=20 |issue=5 |pages=e187 |doi=10.2196/jmir.9330 |pmc=5974463 |pmid=29764795 |doi-access=free }} The collective intelligence outcomes from crowdsourcing are being generated in three broad categories of public health care: health promotion, health research,{{cite journal |author=van der Krieke |display-authors=et al |year=2015 |title=HowNutsAreTheDutch (HoeGekIsNL): A crowdsourcing study of mental symptoms and strengths |url=https://pure.rug.nl/ws/files/30435764/2015_Van_der_Krieke_Jeronimus_HowNutsAreTheDutch_A_Crowdsourcing_Study_of_Mental_Symptoms_and_Strengths.pdf |journal=International Journal of Methods in Psychiatric Research |volume=25 |issue=2 |pages=123–144 |doi=10.1002/mpr.1495 |pmc=6877205 |pmid=26395198 |access-date=26 December 2018 |archive-date=2 August 2019 |archive-url=https://web.archive.org/web/20190802163143/https://pure.rug.nl/ws/files/30435764/2015_Van_der_Krieke_Jeronimus_HowNutsAreTheDutch_A_Crowdsourcing_Study_of_Mental_Symptoms_and_Strengths.pdf |url-status=dead }} and health maintenance.{{Cite book |author=Prpić, J. |date=2015 |title=Health Care Crowds: Collective Intelligence in Public Health. Collective Intelligence 2015. Center for the Study of Complex Systems, University of Michigan. |publisher=Papers.ssrn.com |ssrn=2570593}} Crowdsourcing also enables researchers to move from small homogeneous groups of participants to large heterogeneous groups{{cite journal |last1=van der Krieke |first1=L |last2=Blaauw |first2=FJ |last3=Emerencia |first3=AC |last4=Schenk |first4=HM |last5=Slaets |first5=JP |last6=Bos |first6=EH |last7=de Jonge |first7=P |last8=Jeronimus |first8=BF |year=2016 |title=Temporal Dynamics of Health and Well-Being: A Crowdsourcing Approach to Momentary Assessments and Automated Generation of Personalized Feedback (2016) |journal=Psychosomatic Medicine |volume=79 |issue=2 |pages=213–223 |doi=10.1097/PSY.0000000000000378 |pmid=27551988 |s2cid=10955232|url=https://pure.rug.nl/ws/files/40193705/00006842_201702000_00011.pdf }} beyond convenience samples such as students or more highly educated people. The SESH group focuses on using crowdsourcing to improve health.
Methods
Internet and digital technologies have massively expanded the opportunities for crowdsourcing. However, user communication and platform presentation can have a major bearing on the success of an online crowdsourcing project. The crowdsourced problem can range from huge tasks (such as finding alien life or mapping earthquake zones) to very small ones (such as identifying images). Some examples of successful crowdsourcing themes are problems that bug people, things that make people feel good about themselves, projects that tap into the niche knowledge of proud experts, and subjects that people find sympathetic.Ess, Henk van (2010) [http://www.slideshare.net/searchbistro/harvesting-knowledge-how-to-crowdsource-in-2010 "Crowdsourcing: how to find a crowd"], ARD ZDF Akademie, Berlin, p. 99
Crowdsourcing can either take an explicit or an implicit route:
- Explicit crowdsourcing lets users work together to evaluate, share, and build the results of specific tasks, while implicit crowdsourcing means that users solve a problem as a side effect of something else they are doing. With explicit crowdsourcing, users can evaluate particular items like books or webpages, or share by posting products or items. Users can also build artifacts by providing information and editing other people's work.{{Citation needed|date=September 2022}}
- Implicit crowdsourcing can take two forms: standalone and piggyback. Standalone allows people to solve problems as a side effect of the task they are actually doing, whereas piggyback gathers information from users' activity on a third-party website.{{Citation|last1=Doan|first1=A.|title=Crowdsourcing Systems on the World Wide Web|url=https://cacm.acm.org/magazines/2011/4/106563-crowdsourcing-systems-on-the-world-wide-web/fulltext|journal=Communications of the ACM|volume=54|issue=4|pages=86–96|year=2011|format=PDF|doi=10.1145/1924421.1924442|last2=Ramarkrishnan|first2=R.|last3=Halevy|first3=A.|s2cid=207184672|url-access=subscription}} This is also known as data donation.
In his 2013 book, Crowdsourcing, Daren C. Brabham puts forth a problem-based typology of crowdsourcing approaches:{{Citation|title=Crowdsourcing|year=2013|last1=Brabham|first1=Daren C.|pages=45|publisher=MIT Press}}
- Knowledge discovery and management is used for information management problems where an organization mobilizes a crowd to find and assemble information. It is ideal for creating collective resources.
- Distributed human intelligence tasking (HIT) is used for information management problems where an organization has a set of information in hand and mobilizes a crowd to process or analyze the information. It is ideal for processing large data sets that computers cannot easily handle. Amazon Mechanical Turk uses this approach (a minimal aggregation sketch follows this list).
- Broadcast search is used for ideation problems where an organization mobilizes a crowd to come up with a solution to a problem that has an objective, provable right answer. It is ideal for scientific problem-solving.
- Peer-vetted creative production is used for ideation problems, where an organization mobilizes a crowd to come up with a solution to a problem which has an answer that is subjective or dependent on public support. It is ideal for design, aesthetic, or policy problems.
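For distributed human intelligence tasking, a requester typically assigns the same microtask to several workers and aggregates their answers. Below is a minimal, illustrative sketch of majority-vote aggregation; the item names and answer sets are hypothetical, and real platforms such as Amazon Mechanical Turk provide their own task and quality-control interfaces.

```python
from collections import Counter

# Hypothetical labels collected from several workers for the same items;
# redundancy plus majority voting is a common way to control quality
# in distributed human intelligence tasking.
worker_answers = {
    "image_001": ["cat", "cat", "dog"],
    "image_002": ["dog", "dog", "dog"],
}

def majority_label(labels):
    """Return the most common label and its share of the votes."""
    label, count = Counter(labels).most_common(1)[0]
    return label, count / len(labels)

for item, labels in worker_answers.items():
    label, agreement = majority_label(labels)
    print(item, label, f"{agreement:.0%}")
```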
Ivo Blohm identifies four types of crowdsourcing platforms: Microtasking, Information Pooling, Broadcast Search, and Open Collaboration. They differ in the diversity and aggregation of the contributions that are created. The diversity of information collected can be either homogeneous or heterogeneous. The aggregation of information can be either selective or integrative.{{Definition needed|date=September 2022}}{{cite journal|title= How to Manage Crowdsourcing Platforms Effectively|journal= California Management Review|date= 2018|volume= 60|issue= 2|pages= 122–149|doi= 10.1177/0008125617738255|s2cid= 73551209|url= https://www.alexandria.unisg.ch/252464/1/BlohmEtAl_2018_HowToManageCrowdsourcingIntermediaries.pdf|last1= Blohm|first1= Ivo|last2= Zogaj|first2= Shkodran|last3= Bretschneider|first3= Ulrich|last4= Leimeister|first4= Jan Marco|access-date= 24 August 2020|archive-date= 20 July 2018|archive-url= https://web.archive.org/web/20180720145920/https://www.alexandria.unisg.ch/252464/1/BlohmEtAl_2018_HowToManageCrowdsourcingIntermediaries.pdf|url-status= dead}} Common categories of crowdsourcing that have been used effectively in the commercial world include crowdvoting, crowdsolving, crowdfunding, microwork, creative crowdsourcing, crowdsource workforce management, and inducement prize contests.{{Citation | last = Howe | first = Jeff | title = Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business | publisher= The International Achievement Institute | year = 2008 | url = http://www.bizbriefings.com/Samples/IntInst%20---%20Crowdsourcing.PDF | access-date = 2012-04-09 | archive-url = https://web.archive.org/web/20150923191141/http://www.bizbriefings.com/Samples/IntInst%20---%20Crowdsourcing.PDF | archive-date = 2015-09-23 | url-status = dead }}
In their conceptual review of crowdsourcing, Linus Dahlander, Lars Bo Jeppesen, and Henning Piezunka distinguish four steps in the crowdsourcing process: Define, Broadcast, Attract, and Select.{{Citation |last1=Dahlander |first1=Linus |title=How Organizations Manage Crowds: Define, Broadcast, Attract, and Select |date=2019-01-01 |work=Managing Inter-organizational Collaborations: Process Views |volume=64 |pages=239–270 |editor-last=Sydow |editor-first=Jörg |url=https://www.emerald.com/insight/content/doi/10.1108/s0733-558x20190000064016/full/html |access-date=2025-04-19 |series=Research in the Sociology of Organizations |publisher=Emerald Publishing Limited |doi=10.1108/s0733-558x20190000064016 |isbn=978-1-78756-592-0 |last2=Jeppesen |first2=Lars Bo |last3=Piezunka |first3=Henning |editor2-last=Berends |editor2-first=Hans|url-access=subscription }}
= Crowdvoting =
Crowdvoting occurs when a website gathers a large group's opinions and judgments on a certain topic. Some crowdsourcing tools and platforms allow participants to rank each other's contributions, e.g. in answer to the question "What is one thing we can do to make Acme a great company?" One common method for ranking is "like" counting, where the contribution with the most "like" votes ranks first. This method is simple and easy to understand, but it privileges early contributions, which have more time to accumulate votes.{{Citation needed|date=September 2022}} In recent years, several crowdsourcing companies have begun to use pairwise comparisons backed by ranking algorithms. Ranking algorithms do not penalize late contributions.{{Citation needed|date=September 2022}} They also produce results more quickly; one platform reports that ranking algorithms are at least 10 times faster than manual stack ranking.{{cite web |author= |date=25 May 2017 |title=Crowdvoting: How Elo Limits Disruption |url=https://thevisionlab.com/crowdsourcing/crowdvoting-elo |website=thevisionlab.com}} One drawback, however, is that ranking algorithms are more difficult to understand than vote counting.
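The source cited above mentions Elo ratings as one such pairwise scheme. The following is a minimal sketch of Elo-style pairwise ranking; the idea names, starting rating, and K-factor are illustrative assumptions rather than any vendor's actual parameters.

```python
# Minimal sketch of pairwise-comparison crowdvoting with Elo updates
# (illustrative only; real crowdvoting platforms differ in detail).

def elo_update(winner, loser, ratings, k=32):
    """Update Elo ratings after one pairwise vote."""
    expected_win = 1 / (1 + 10 ** ((ratings[loser] - ratings[winner]) / 400))
    ratings[winner] += k * (1 - expected_win)
    ratings[loser] -= k * (1 - expected_win)

# Every contribution starts from the same rating, so late entries are
# not penalized for having had less time to accumulate votes.
ratings = {"idea_a": 1000.0, "idea_b": 1000.0, "idea_c": 1000.0}
votes = [("idea_a", "idea_b"), ("idea_c", "idea_a"), ("idea_c", "idea_b")]
for winner, loser in votes:
    elo_update(winner, loser, ratings)

ranking = sorted(ratings, key=ratings.get, reverse=True)
print(ranking)  # e.g. ['idea_c', 'idea_a', 'idea_b']
```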
The Iowa Electronic Market is a prediction market that gathers crowds' views on politics and tries to ensure accuracy by having participants pay money to buy and sell contracts based on political outcomes.{{cite news |title=IEM Demonstrates the Political Wisdom of Crowds |first=John |last=Robson |url=http://tippie.uiowa.edu/iem/media/story.cfm?id=2793 |newspaper=Canoe.ca |date=24 February 2012 |access-date=31 March 2012 |archive-url=https://web.archive.org/web/20120407121438/http://tippie.uiowa.edu/iem/media/story.cfm?ID=2793 |archive-date=2012-04-07 |url-status=dead }} Some of the most famous examples have made use of social media channels: Domino's Pizza, Coca-Cola, Heineken, and Sam Adams have crowdsourced a new pizza, bottle design, beer, and song respectively.{{cite web|url= http://www.digitalagencymarketing.com/2012/03/4-great-examples-of-social-crowdsourcing/|title= 4 Great Examples of Crowdsourcing through Social Media|publisher= digitalagencymarketing.com|year= 2012|access-date= 2012-03-29|archive-url= https://web.archive.org/web/20120401224920/http://www.digitalagencymarketing.com/2012/03/4-great-examples-of-social-crowdsourcing/|archive-date= 2012-04-01|url-status= dead}} The website Threadless selects the T-shirts it sells by having users submit designs and vote on the ones they like; the winning designs are then printed and made available for purchase.{{Citation |last=Brabham |first=Daren |title=Crowdsourcing as a Model for Problem Solving: An Introduction and Cases |journal=Convergence: The International Journal of Research into New Media Technologies |volume=14 |issue=1 |year=2008 |pages=75–90 |url=http://www.clickadvisor.com/downloads/Brabham_Crowdsourcing_Problem_Solving.pdf |archive-url=https://web.archive.org/web/20120802162119/http://www.clickadvisor.com/downloads/Brabham_Crowdsourcing_Problem_Solving.pdf |archive-date=2012-08-02 |doi=10.1177/1354856507084420 |url-status=dead |citeseerx=10.1.1.175.1623 |s2cid=145310730 }}
The California Report Card (CRC), a program jointly launched in January 2014 by the Center for Information Technology Research in the Interest of Society{{cite web|last1=Goldberg|first1=Ken|last2=Newsom|first2=Gavin|title=Let's amplify California's collective intelligence|url=http://citris-uc.org/lets-amplify-californias-collective-intelligence-op-ed-ken-goldberg-gavin-newsom-california-report-card/|website=Citris-uc.org|access-date=14 June 2014|date=2014-06-12}} and Lt. Governor Gavin Newsom, is an example of modern-day crowdvoting. Participants access the CRC online and vote on six timely issues. Through principal component analysis, the users are then placed into an online "café" in which they can present their own political opinions and grade the suggestions of other participants. This system aims to effectively involve the greater public in relevant political discussions and highlight the specific topics with which people are most concerned.
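The grouping step can be pictured as projecting each participant's issue grades onto a few principal components and clustering nearby projections. Below is a minimal sketch using numpy; the grade matrix, the choice of two components, and the grouping rule are illustrative assumptions, not the CRC's actual pipeline.

```python
import numpy as np

# Hypothetical grades (1-5) from five participants (rows) on six issues
# (columns), echoing the CRC's six timely issues; values are made up.
grades = np.array([
    [5, 4, 5, 2, 1, 3],
    [4, 5, 5, 1, 2, 2],
    [1, 2, 1, 5, 5, 4],
    [2, 1, 2, 4, 5, 5],
    [3, 3, 3, 3, 3, 3],
], dtype=float)

# PCA via the covariance matrix: project each participant onto the two
# leading principal components; nearby projections suggest shared views.
centered = grades - grades.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)            # ascending eigenvalues
top2 = eigvecs[:, np.argsort(eigvals)[::-1][:2]]  # two largest components
projected = centered @ top2

print(projected.round(2))  # rows close together could share a "cafe"
```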
Crowdvoting's value in the movie industry was shown when in 2009 a crowd accurately predicted the success or failure of a movie based on its trailer,Escoffier, N. and B. McKelvey (2014). "Using "Crowd-Wisdom Strategy" to Co-Create Market Value: Proof-of-Concept from the Movie Industry." in International Perspective on Business Innovation and Disruption in the Creative Industries: Film, Video, Photography, P. Wikstrom and R. DeFillippi, eds., UK: Edward Elgar Publishing Ltd, Chap. 11. {{ISBN| 9781783475339}}{{cite news |url=https://www.hollywoodreporter.com/news/how-boxoffice-trading-could-flop-22886 |last=Block |first= A. B. |title=How boxoffice trading could flop. |work=The Hollywood Reporter |date=21 April 2010 }} a feat that was replicated in 2013 by Google.Chen, A. and Panaligan, R. (2013). [https://adwords.googleblog.com/2013/06/quantifying-movie-magic-with-google.html "Quantifying movie magic with Google search."] Google White Paper, Industry Perspectives+User Insights
On Reddit, users collectively rate web content, discussions, and comments throughout the site, including questions posed to persons of interest in "AMA" (Ask Me Anything) and AskScience online interviews.
In 2017, Project Fanchise purchased a team in the Indoor Football League and created the Salt Lake Screaming Eagles, a fan-run team. Using a mobile app, the fans voted on the day-to-day operations of the team, the mascot name, the signing of players, and even offensive play calling during games.{{Cite news|url=https://www.nytimes.com/2017/02/17/sports/football/indoor-football-league-screaming-eagles.html|title=An Indoor Football Team Has Its Fans Call the Plays|last=Williams|first=Jack|date=2017-02-17|work=The New York Times|access-date=2018-02-07|issn=0362-4331}}
=Crowdfunding=
{{Main|Crowdfunding}}
Crowdfunding is the process of funding projects by a multitude of people contributing a small amount to attain a certain monetary goal, typically via the Internet.{{cite news|url=https://www.forbes.com/sites/tanyaprive/2012/11/27/what-is-crowdfunding-and-how-does-it-benefit-the-economy/|title=What Is Crowdfunding And How Does It Benefit The Economy |work=Forbes.com|access-date=2015-07-02|first=Tanya|last=Prive}} Crowdfunding has been used for both commercial and charitable purposes.{{Citation | last1 = Choy | first1 = Katherine | last2 = Schlagwein| first2 = Daniel | title = Crowdsourcing for a better world: On the relation between IT affordances and donor motivations in charitable crowdfunding | journal = Information Technology & People | volume = 29 | issue = 1 | year = 2016 | doi=10.1108/ITP-09-2014-0215 | pages=221–247| s2cid = 12352130 | url = https://unsworks.unsw.edu.au/bitstreams/c64b500c-b9a6-4ad8-a955-569cb9325363/download | hdl = 1959.4/unsworks_38196 | hdl-access = free }} The crowdfunding model that has been around the longest is rewards-based crowdfunding. In this model, people can prepurchase products, buy experiences, or simply donate. While this funding may in some cases go towards helping a business, funders are not allowed to invest and become shareholders via rewards-based crowdfunding.{{cite news|url=https://www.forbes.com/sites/chancebarnett/2014/08/29/crowdfunding-sites-in-2014/|title=Crowdfunding Sites In 2014 |work=Forbes.com|access-date=2015-07-02|first=Chance|last=Barnett}}
Individuals, businesses, and entrepreneurs can showcase their businesses and projects by creating a profile, which typically includes a short video introducing their project, a list of rewards per donation, and illustrations through images.{{Citation needed|date=September 2022}} Funders make monetary contributions for numerous reasons:
- They connect to the greater purpose of the campaign, such as being a part of an entrepreneurial community and supporting an innovative idea or product.{{cite journal | last1=Agrawal | first1=Ajay | last2=Catalini | first2=Christian | last3=Goldfarb | first3=Avi | title=Some Simple Economics of Crowdfunding | journal=Innovation Policy and the Economy | publisher=University of Chicago Press | volume=14 | year=2014 | issn=1531-3468 | doi=10.1086/674021 | pages=63–97| hdl=1721.1/108043 | s2cid=16085029 |url=https://www.nber.org/system/files/working_papers/w19133/w19133.pdf}}
- They connect to a physical aspect of the campaign like rewards and gains from investment.
- They connect to the creative display of the campaign's presentation.
- They want to see new products before the public.
As of 2012, equity crowdfunding in the US was caught up in a refinement process for the regulations of the Securities and Exchange Commission, which had until 1 January 2013 to finalize rules for the new fundraising methods. The regulators were overwhelmed trying to regulate Dodd-Frank and all the other rules and regulations involving public companies and the way they traded. Advocates of regulation claimed that crowdfunding would open up the floodgates for fraud, called it the "wild west" of fundraising, and compared it to the 1980s days of penny stock "cold-call cowboys". The process allowed for up to $1 million to be raised without some of the usual regulatory requirements. Under the then-current proposal, companies would have exemptions available and be able to raise capital from a larger pool of persons, with lower thresholds for investor criteria, whereas the old rules required that the person be an "accredited" investor. Funders are often recruited from social networks, and the funds can be acquired through an equity purchase, loan, donation, or pre-order. The amounts collected have become quite high, with requests of over a million dollars, as with Trampoline Systems, which used crowdfunding to finance the commercialization of its new software.{{Citation needed|date=September 2022}}
=Inducement prize contests=
Web-based idea competitions or inducement prize contests often consist of generic ideas, cash prizes, and an Internet-based platform to facilitate easy idea generation and discussion. One example is IBM's 2006 "Innovation Jam", which attracted over 140,000 international participants and yielded around 46,000 ideas.
{{Citation | last1=Leimeister | first1=J.M. | last2 = Huber | first2 = M. | last3 = Bretschneider | first3 = U. | last4 = Krcmar | first4 = H. | title = Leveraging Crowdsourcing: Activation-Supporting Components for IT-Based Ideas Competition | journal = Journal of Management Information Systems | volume = 26 | issue = 1 | pages = 197–224 | year = 2009 | url = http://portal.acm.org/citation.cfm?id=1653890 | doi=10.2753/mis0742-1222260108| s2cid=17485373 | url-access = subscription }}{{cite journal| title=Community Engineering for Innovations: The Ideas Competition as a Method to Nurture a Virtual Community for Innovations| last1=Ebner| first1=W. |last2 = Leimeister| first2 = J. | last3 = Krcmar| first3 = H.| url=https://www.researchgate.net/publication/227500941| journal=R&D Management| volume=39| issue=4| pages=342–356| date=September 2009| access-date=20 January 2024| doi=10.1111/j.1467-9310.2009.00564.x}} Another example is the Netflix Prize, awarded in 2009. Participants were asked to develop a recommendation algorithm more accurate than Netflix's existing one. The grand prize of US$1,000,000 went to a team whose algorithm beat Netflix's own algorithm for predicting ratings by 10.06%.{{Citation needed|date=September 2022}}
Another example of competition-based crowdsourcing is the 2009 DARPA balloon experiment, where DARPA placed 10 balloon markers across the United States and challenged teams to compete to be the first to report the locations of all the balloons. Completing the challenge quickly required collaboration in addition to the competitive motivation of the contest as a whole, and the winning team (from MIT, which finished in less than nine hours) established its own "collaborapetitive" environment to generate participation.{{cite web|url=https://networkchallenge.darpa.mil/default.aspx |title=DARPA Network Challenge |publisher=DARPA Network Challenge |access-date=28 November 2011 |url-status=dead |archive-url=https://web.archive.org/web/20110811233340/https://networkchallenge.darpa.mil/Default.aspx |archive-date=11 August 2011 }} A similar challenge was the Tag Challenge, funded by the US State Department, which required locating and photographing individuals in five cities in the US and Europe within 12 hours based only on a single photograph. The winning team managed to locate three suspects by mobilizing volunteers worldwide, using an incentive scheme similar to the one used in the balloon challenge.{{cite magazine|url=https://www.newscientist.com/article/dn21666-social-media-web-snares-criminals.html |title=Social media web snares 'criminals' |magazine=New Scientist |access-date=4 April 2012}}
Using open innovation platforms is an effective way to crowdsource people's thoughts and ideas for research and development. The company InnoCentive is a crowdsourcing platform for corporate research and development, where difficult scientific problems are posted for crowds of solvers to answer for a cash prize ranging from $10,000 to $100,000 per challenge. InnoCentive, of Waltham, Massachusetts, and London, England, provides access to millions of scientific and technical experts from around the world. The company claims a 50% success rate in solving previously unsolved scientific and technical problems. The X Prize Foundation creates and runs incentive competitions offering between $1 million and $30 million for solving challenges. Local Motors is another example of crowdsourcing: a community of 20,000 automotive engineers, designers, and enthusiasts who compete to build off-road rally trucks.{{cite web|url=http://www.fourhourworkweek.com/blog/2012/02/20/beyond-x-prize-the-10-best-crowdsourcing-tools-and-technologies/ |title=Beyond XPrize: The 10 Best Crowdsourcing Tools and Technologies |date = 20 February 2012 |access-date=30 March 2012}}
=Implicit crowdsourcing=
Implicit crowdsourcing is less obvious because users do not necessarily know they are contributing, yet it can still be very effective in completing certain tasks.{{Citation needed|date=September 2022}} Rather than users actively participating in solving a problem or providing information, implicit crowdsourcing involves users doing another task entirely, while a third party gains information on a different topic from their actions.
A good example of implicit crowdsourcing is the ESP game, where users guess words to describe images, and the agreed-upon words are then used as metadata for the images. Another popular use of implicit crowdsourcing is reCAPTCHA, which asks people to solve CAPTCHAs to prove they are human and then serves words from old books that cannot be deciphered by computers, in order to digitize them for the web. Like many tasks solved using Mechanical Turk, CAPTCHAs are simple for humans but often very difficult for computers. A minimal sketch of this mechanism follows.
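In the original reCAPTCHA design, each challenge paired a control word with a known answer alongside an unknown word from a scanned book: the control word verified the user, and agreement among users digitized the unknown word. The sketch below illustrates that flow; the words, agreement threshold, and voting rule are illustrative assumptions, not Google's actual implementation.

```python
from collections import Counter

# One challenge pairs a control word (known answer) with an unknown
# word from a scanned book that OCR could not decipher.
control_answer = "quartz"   # known ground truth for the control word
unknown_votes = []          # collected transcriptions of the unknown word

def submit(control_guess, unknown_guess):
    """Accept the unknown-word guess only if the control word is correct."""
    if control_guess.lower() == control_answer:
        unknown_votes.append(unknown_guess.lower())
        return True         # user verified as human
    return False

for guesses in [("quartz", "meridian"), ("quartz", "meridian"),
                ("qwartz", "banana"), ("quartz", "meridiam")]:
    submit(*guesses)

word, count = Counter(unknown_votes).most_common(1)[0]
if count >= 2:              # simple agreement threshold (assumption)
    print("digitized word:", word)   # -> meridian
```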
Piggyback crowdsourcing can be seen most frequently on websites such as Google that data-mine users' search histories and websites to discover keywords for ads, spelling corrections, and synonyms. In this way, users unintentionally help to refine existing systems, such as Google Ads.
{{Citation | last1 = Kittur | first1 =A. | last2 = Chi | first2 = E.H. | last3 = Sun | first3 = B. | title = Crowdsourcing user studies with Mechanical Turk | journal = Chi 2008 | year = 2008 | url = http://www-users.cs.umn.edu/~echi/papers/2008-CHI2008/2008-02-mech-turk-online-experiments-chi1049-kittur.pdf }}
= Other types =
- Creative crowdsourcing involves sourcing people for creative projects such as graphic design, architecture, product design, apparel design, movies,Cunard, C. (19 July 2010). [https://dailybruin.com/2010/07/19/the_movie_research_experience_gets_audiences_involved_in_filmmaking "The Movie Research Experience gets audiences involved in filmmaking."] The Daily Bruin writing, company naming,{{Cite news|last=MacArthur|first=Kate|title=Squadhelp wants your company to crowdsource better names (and avoid Boaty McBoatface)|work=chicagotribune.com|url=http://www.chicagotribune.com/bluesky/originals/ct-squadhelp-startup-names-bsi-20170331-story.html|access-date=2017-08-28}} illustration, etc.{{cite web|date=4 June 2013|title=Compete To Create Your Dream Home|work=Co.Exist |url=http://www.fastcoexist.com/1682162/a-site-that-lets-designers-compete-to-create-your-dream-home|access-date=2014-02-03|publisher=FastCoexist.com}}{{cite news|date=11 June 2012|title=Designers, clients forge ties on web|newspaper=Boston Herald|url=http://bostonherald.com/business/technology/technology_news/2012/06/designers_clients_forge_ties_web|access-date=2014-02-03}} While crowdsourcing competitions have been used for decades in some creative fields such as architecture, creative crowdsourcing has proliferated with the recent development of web-based platforms where clients can solicit a wide variety of creative work at lower cost than by traditional means.{{Citation needed|date=September 2022}}
- Crowdshipping (crowd-shipping) is a peer-to-peer shipping service, usually conducted via an online platform or marketplace.{{Citation|last=Dolan|first=Shelagh|title=Crowdsourced delivery explained: making same day shipping cheaper through local couriers.|url=https://www.businessinsider.com/crowdsourced-delivery-shipping-explained|journal=Business Insider|archive-url=https://web.archive.org/web/20180522060126/http://www.businessinsider.com/crowdsourced-delivery-shipping-explained|access-date=21 May 2018|archive-date=22 May 2018|url-status=dead}} There are several methods that have been categorized as crowd-shipping:
- Travelers heading in the direction of the buyer who are willing to bring the package as part of their luggage for a reward.{{Citation|last=Murison|first=Malek|title=LivingPackets uses IoT, crowdshipping to transform deliveries|date=19 April 2018|url=https://internetofbusiness.com/livingpackets-iot-international-deliveries/|journal=Internet of Business|access-date=19 April 2018}}
- Truck drivers whose route passes the buyer's location and who are willing to take extra items in their truck.{{Citation|last1=Biller|first1=David|title=Goldman Sachs, Soros Bet on the Uber of Brazilian Trucking|url=https://www.bloomberg.com/news/articles/2018-06-19/goldman-sachs-soros-bet-on-the-uber-of-brazilian-trucking|journal=Bloomberg|access-date=11 March 2019|last2=Sciaudone|first2=Christina|date=19 June 2018 }}
- Community-based platforms that connect international buyers and local forwarders by allowing buyers to use a forwarder's address as the purchase destination, after which the forwarder ships the items onward to the buyer.{{Citation|last=Tyrsina|first=Radu|title=Parcl Uses Trusted Forwarders to Bring you Products that don't Ship to your Country|url=https://techpp.com/2015/10/01/parcl-buy-products-that-dont-ship-to-your-country/|journal=Technology Personalised|archive-url=https://web.archive.org/web/20151003234051/http://techpp.com/2015/10/01/parcl-buy-products-that-dont-ship-to-your-country/|access-date=1 October 2015|archive-date=3 October 2015|url-status=dead}}
- Crowdsolving is a collaborative and holistic way of solving a problem through many people, communities, groups, or resources. It is a type of crowdsourcing focused on complex and intellectually demanding problems that require considerable effort and where the quality or uniqueness of each contribution matters.Geiger D, Rosemann M, Fielt E. (2011) [https://aisel.aisnet.org/acis2011/33/ Crowdsourcing information systems: a systems theory perspective]. Proceedings of the 22nd Australasian Conference on Information Systems.
- Problem–idea chains are a form of idea crowdsourcing and crowdsolving where individuals are asked to submit ideas to solve problems and then problems that can be solved with those ideas. The aim is to encourage individuals to find practical, well-thought-through solutions to problems.{{Cite journal|last=Powell|first=D|date=2015|title=A new tool for crowdsourcing|url=https://cyberleninka.ru/article/n/a-new-tool-for-crowdsourcing|journal=МИР (Модернизация. Инновации. Развитие)|volume=6|issue=2-2 (22)|issn=2079-4665}}
- Macrowork tasks typically have these characteristics: they can be done independently, they take a fixed amount of time, and they require special skills. Macro-tasks could be part of specialized projects or could be part of a large, visible project where workers pitch in wherever they have the required skills. The key distinguishing factors are that macro-work requires specialized skills and typically takes longer, while microwork requires no specialized skills.
- Microwork consists of small crowdsourced tasks for which computers lack aptitude, completed for small amounts of money. Amazon's Mechanical Turk hosts many different projects for users to participate in, where each task requires very little time and offers a very small payment. When choosing tasks, since only certain users "win", users learn to submit later and pick less popular tasks to increase the likelihood of having their work chosen.{{Citation|last1=Yang|first1=J.|url=http://www-personal.umich.edu/~ladamic/papers/taskcn/EC2008Witkey.pdf|year=2008|doi=10.1145/1386790.1386829|last2=Adamic|first2=L.|last3=Ackerman|first3=M.|title=Proceedings of the 9th ACM conference on Electronic commerce |chapter=Crowdsourcing and knowledge sharing: Strategic user behavior on taskcn |pages=246–255 |isbn=9781605581699 |s2cid=15553154|access-date=28 February 2012|archive-date=29 July 2020|archive-url=https://web.archive.org/web/20200729004950/http://www-personal.umich.edu/~ladamic/papers/taskcn/EC2008Witkey.pdf|url-status=dead}} An example of a Mechanical Turk project was when users searched satellite images for a boat to find Jim Gray, a missing computer scientist.
- Mobile crowdsourcing involves activities that take place on smartphones or mobile platforms and that frequently make use of GPS technology.{{cite web|title=Mobile Crowdsourcing|url=http://www.clickworker.com/en/crowdsourcing-glossar/mobile-crowdsourcing/|access-date=10 December 2014|website=Clickworker}} This allows for real-time data gathering and gives projects greater reach and accessibility. However, mobile crowdsourcing can lead to an urban bias and can raise safety and privacy concerns.{{cite conference | last1=Thebault-Spieker | first1=Jacob | last2=Terveen | first2=Loren G. | last3=Hecht | first3=Brent | title=Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing | chapter=Avoiding the South Side and the Suburbs | publisher=ACM | publication-place=New York, NY, USA | date=2015-02-28 | pages=265–275 | doi=10.1145/2675133.2675278 | isbn=9781450329224 }}{{cite book|author1=Chatzimiloudis, Konstantinidis|author2=Laoudias, Zeinalipour-Yazti|name-list-style=amp|title=Crowdsourcing with smartphones|url=http://www.cs.ucy.ac.cy/~dzeina/papers/ic12-crowdsourcing.pdf}}{{Cite journal|last1=Arkian|first1=Hamid Reza|last2=Diyanat|first2=Abolfazl|last3=Pourkhalili|first3=Atefe|year=2017|title=MIST: Fog-based data analytics scheme with cost-efficient resource provisioning for IoT crowdsensing applications|journal=Journal of Network and Computer Applications|volume=82|pages=152–165|doi=10.1016/j.jnca.2017.01.012}}
- Simple projects are those that require more time and skill than micro- and macro-work. Whereas an example of macrowork would be writing survey feedback, simple projects include activities like writing a basic line of code or programming a database, which require a larger time commitment and higher skill level. These projects are usually not found on sites like Amazon Mechanical Turk and are instead posted on platforms like Upwork that call for specific expertise.{{Cite journal|last=Felstiner|first=Alek|date=August 2011|title=Working the Crowd: Employment and Labor Law in the Crowdsourcing Industry|url=http://wtf.tw/ref/felstiner.pdf|journal=Berkeley Journal of Employment & Labor Law|volume=32|pages=150–151|via=WTF}}
- Complex projects generally take the most time, have higher stakes, and call for people with very specific skills. These are generally "one-off" projects that are difficult to accomplish and can include projects such as designing a new product that a company hopes to patent. Such projects are considered complex because design is a meticulous process that requires a great deal of time to perfect, and the people completing the project must have specialized training in design to complete it effectively. These projects usually pay the most, yet are rarely offered.{{Cite web|title=View of Crowdsourcing: Libertarian Panacea or Regulatory Nightmare?|url=https://online-shc.com/arc/ojs/index.php/JOHE/article/view/4/4|access-date=2017-05-26|website=online-shc.com}}{{Dead link|date=November 2019|bot=InternetArchiveBot|fix-attempted=yes}}
- Crowdsourcing-based optimization refers to a class of methods that use crowdsourcing to enable a group of workers to collaboratively collect data and solve optimization problems related to the data. Because workers are heterogeneous, the data they collect varies, and their understanding of the optimization problems also differs, posing challenges for collaboratively solving a global optimization problem. Representative methods include CrowdEC, a mechanism that dispatches optimization tasks to a group of workers who collaborate to perform evolutionary computation (EC) in a distributed manner (see the sketch after this list).{{Cite journal|last1=WEI|first1=F-F.|last2=CHEN|first2=W-N.|last3=Guo|first3=X-Q.|last4=Zhao|first4=B.|last5=Jeon|first5=S-W.|last6=Zhang|first6=J.|date=2024|title=CrowdEC: Crowdsourcing-based Evolutionary Computation for Distributed Optimization|url=https://ieeexplore.ieee.org/document/10618890|journal= IEEE Transactions on Services Computing|volume=17|issue=6 |pages=3286–3299|doi=10.1109/TSC.2024.3433487 |url-access=subscription}}
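The distributed flavor of such methods can be pictured with an island model, in which each worker evolves its own candidate pool and periodically receives the best solution found elsewhere. The sketch below is a generic illustration of that idea on a toy objective; it is not the CrowdEC algorithm, whose actual task-dispatch mechanism is described in the cited paper.

```python
import random

# Illustrative island-model sketch of distributed evolutionary computation:
# each "worker" evolves its own sub-population on a shared objective and
# periodically receives the best solution found by any worker.

def fitness(candidate):
    """Toy objective: maximize -sum(x^2), optimum at the origin."""
    return -sum(x * x for x in candidate)

def evolve(population, generations=20, sigma=0.1):
    """Simple loop: mutate the current best, replace the current worst."""
    for _ in range(generations):
        parent = max(population, key=fitness)
        child = [x + random.gauss(0, sigma) for x in parent]
        population.sort(key=fitness)      # ascending: worst first
        population[0] = child
    return population

random.seed(0)
islands = [[[random.uniform(-1, 1) for _ in range(3)] for _ in range(10)]
           for _ in range(4)]             # 4 workers, 10 candidates each

best = None
for _ in range(5):                        # 5 rounds of work plus migration
    islands = [evolve(pop) for pop in islands]
    best = max((max(pop, key=fitness) for pop in islands), key=fitness)
    for pop in islands:                   # migration: share the global best
        pop.sort(key=fitness)
        pop[0] = list(best)

print(round(fitness(best), 4))            # approaches 0 as rounds increase
```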
Demographics of the crowd
The crowd is an umbrella term for the people who contribute to crowdsourcing efforts. Though it is sometimes difficult to gather data about the demographics of the crowd as a whole, several studies have examined various specific online platforms. Amazon Mechanical Turk has received a great deal of attention in particular. A study in 2008 by Ipeirotis found that users at that time were primarily American, young, female, and well-educated, with 40% earning more than $40,000 per year. In November 2009, Ross found a very different Mechanical Turk population, 36% of which was Indian. Two-thirds of Indian workers were male, and 66% had at least a bachelor's degree. Two-thirds had annual incomes less than $10,000, with 27% sometimes or always depending on income from Mechanical Turk to make ends meet.{{cite journal |last1=Ross |first1=J. |last2=Irani |first2=L. |last3=Silberman |first3=M.S. |last4=Zaldivar |first4=A. |last5=Tomlinson |first5=B. |title=Who are the Crowdworkers? Shifting Demographics in Mechanical Turk |journal=Chi 2010 |year=2010 |url=http://www.ics.uci.edu/~jwross/pubs/RossEtAl-WhoAreTheCrowdworkers-altCHI2010.pdf |archive-url=https://wayback.archive-it.org/all/20110401101755/http://www.ics.uci.edu/~jwross/pubs/RossEtAl-WhoAreTheCrowdworkers-altCHI2010.pdf |url-status=dead |archive-date=2011-04-01 |access-date=2012-02-28 }} More recent studies have found that U.S. Mechanical Turk workers are approximately 58% female, and nearly 67% of workers are in their 20s and 30s.{{Cite journal |last1=Huff |first1=Connor |last2=Tingley |first2=Dustin |date=2015-07-01 |title="Who are these people?" Evaluating the demographic characteristics and political preferences of MTurk survey respondents |journal=Research & Politics |volume=2 |issue=3 |pages=205316801560464 |doi=10.1177/2053168015604648 |s2cid=7749084 |doi-access=free }}{{Cite journal |last1=Levay |first1=Kevin E. |last2=Freese |first2=Jeremy |last3=Druckman |first3=James N. |date=2016-01-01 |title=The Demographic and Political Composition of Mechanical Turk Samples |journal=SAGE Open |volume=6 |issue=1 |pages=215824401663643 |doi=10.1177/2158244016636433 |s2cid=147299692 |doi-access=free }} Close to 80% are White, and 9% are Black. MTurk workers are less likely to be married or have children as compared to the general population: in the US population over 18, 45% are unmarried, while the proportion of unmarried workers on MTurk is around 57%. Additionally, about 55% of MTurk workers do not have any children, which is significantly higher than in the general population. Approximately 68% of U.S. MTurk workers are employed, compared to 60% in the general population. MTurk workers in the U.S. are also more likely to have a four-year college degree (35%) than the general population (27%). Politics within the U.S. sample of MTurk skew liberal, with 46% Democrats, 28% Republicans, and 26% "other". MTurk workers are also less religious than the U.S. population, with 41% religious, 20% spiritual, 21% agnostic, and 16% atheist.
The demographics of Microworkers.com differ from those of Mechanical Turk in that the US and India together account for only 25% of workers; 197 countries are represented among users, with Indonesia (18%) and Bangladesh (17%) contributing the largest shares. However, 28% of employers are from the US.{{Citation | last1 = Hirth | first1 = M. | last2 = Hoßfeld | first2 =T. | last3 = Train-Gia | first3 = P. | title = Human Cloud as Emerging Internet Application – Anatomy of the Microworkers Crowdsourcing Platform | year = 2011 | url = http://www3.informatik.uni-wuerzburg.de/TR/tr478.pdf}}
Another study of the demographics of the crowd at iStockphoto found a crowd that was largely white, middle- to upper-class, and highly educated, worked in a so-called "white-collar job", and had a high-speed Internet connection at home.{{cite journal | last = Brabham | first = Daren C. | title = Moving the Crowd at iStockphoto: The Composition of the Crowd and Motivations for Participation in a Crowdsourcing Application | journal = First Monday | volume = 13 | issue = 6 | year = 2008 | doi = 10.5210/fm.v13i6.2159 | doi-access = free }} In a 30-day crowdsourcing diary study in Europe, the participants were predominantly highly educated women.
Studies have also found that crowds are not simply collections of amateurs or hobbyists. Rather, crowds are often professionally trained in a discipline relevant to a given crowdsourcing task and sometimes hold advanced degrees and many years of experience in the profession.{{cite journal | last = Brabham | first = Daren C. | title = Managing Unexpected Publics Online: The Challenge of Targeting Specific Groups with the Wide-Reaching Tool of the Internet | journal = International Journal of Communication | year = 2012
| volume = 6 | page = 20 | url = http://ijoc.org/ojs/index.php/ijoc/article/view/1542/751}}{{cite journal | last = Brabham | first = Daren C. | title = Moving the Crowd at Threadless: Motivations for Participation in a Crowdsourcing Application | journal = Information, Communication & Society | year = 2010
| doi=10.1080/13691181003624090 | volume=13 | issue = 8 | pages=1122–1145| s2cid = 143402410 }} Claiming that crowds are amateurs, rather than professionals, is not only factually untrue but may also lead to marginalization of crowd labor rights.{{cite journal | last = Brabham | first = Daren C. | title = The Myth of Amateur Crowds: A Critical Discourse Analysis of Crowdsourcing Coverage | journal = Information, Communication & Society | year = 2012
| doi=10.1080/1369118X.2011.641991 | volume=15 | issue = 3 | pages=394–410| s2cid = 145675154 }}
Gregory Saxton et al. studied the role of community users, among other elements, in their content analysis of 103 crowdsourcing organizations. They developed a taxonomy of nine crowdsourcing models (intermediary model, citizen media production, collaborative software development, digital goods sales, product design, peer-to-peer social financing, consumer report model, knowledge base building model, and collaborative science project model) in which to categorize the roles of community users, such as researcher, engineer, programmer, journalist, graphic designer, etc., and the products and services developed.{{cite journal| title = Rules of Crowdsourcing: Models, Issues, and Systems of Control| year = 2013 | doi=10.1080/10580530.2013.739883 | volume=30 | journal=Information Systems Management | pages=2–20| citeseerx = 10.1.1.300.8026 | s2cid = 16811686 | last1 = Saxton | first1 = Gregory D. | last2 = Oh | first2 = Onook | last3 = Kishore | first3 = Rajiv }}
==Motivations==
{{Further|Online participation#Motivations}}
===Contributors===
Many researchers suggest that both intrinsic and extrinsic motivations cause people to contribute to crowdsourced tasks, and that these factors influence different types of contributors.{{Cite journal |last=Aitamurto |first=Tanja |year=2015 |title=Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change, and Peer Learning |url=http://crowdsourcinginjournalism.com/2015/10/28/motivation-factors-in-crowdsourced-journalism-social-impact-social-change-and-peer-learning/ |journal=International Journal of Communication |volume=9 |pages=3523–3543}}{{cite journal | last1 = Kaufmann | first1 = N. | last2 = Schulze | first2 = T. | last3 = Veit | first3 = D. | title = More than fun and money. Worker Motivation in Crowdsourcing – A Study on Mechanical Turk | year = 2011 | journal = Proceedings of the Seventeenth Americas Conference on Information Systems | url = http://schader.bwl.uni-mannheim.de/fileadmin/files/publikationen/Kaufmann_Schulze_Veit_2011_-_More_than_fun_and_money_Worker_motivation_in_Crowdsourcing_-_A_Study_on_Mechanical_Turk_AMCIS_2011.pdf | url-status = dead | archive-url = https://web.archive.org/web/20120227173340/http://schader.bwl.uni-mannheim.de/fileadmin/files/publikationen/Kaufmann_Schulze_Veit_2011_-_More_than_fun_and_money_Worker_motivation_in_Crowdsourcing_-_A_Study_on_Mechanical_Turk_AMCIS_2011.pdf | archive-date = 2012-02-27 }}{{cite journal | last = Brabham | first = Daren C. | title = Motivations for Participation in a Crowdsourcing Application to Improve Public Engagement in Transit Planning | year = 2012 | journal = Journal of Applied Communication Research | doi=10.1080/00909882.2012.693940 | volume=40 | issue = 3 | pages=307–328| s2cid = 144807388 }}{{cite journal | last1 = Lietsala | first1 = Katri | last2 = Joutsen | first2 = Atte | title = Hang-a-rounds and True Believers: A Case Analysis of the Roles and Motivational Factors of the Star Wreck Fans | year = 2007 | journal = MindTrek 2007 Conference Proceedings}}{{Cite journal |last1=Dahlander |first1=Linus |last2=Piezunka |first2=Henning |date=2014-06-01 |title=Open to suggestions: How organizations elicit suggestions through proactive and reactive attention |url=https://linkinghub.elsevier.com/retrieve/pii/S0048733313001108 |journal=Research Policy |series=Open Innovation: New Insights and Evidence |volume=43 |issue=5 |pages=812–827 |doi=10.1016/j.respol.2013.06.006 |issn=0048-7333|url-access=subscription }} For example, people employed in a full-time position rate human capital advancement as less important than part-time workers do, while women rate social contact as more important than men do.
Intrinsic motivations are broken down into two categories: enjoyment-based and community-based motivations. Enjoyment-based motivations arise from the fun and enjoyment contributors experience through their participation; they include skill variety, task identity, task autonomy, direct feedback from the job, and treating the job as a pastime.{{Citation needed|date=September 2022}} Community-based motivations arise from community participation and include community identification and social contact. In crowdsourced journalism, the motivation factors are intrinsic: the crowd is driven by the possibility of making a social impact, contributing to social change, and helping their peers.
Extrinsic motivations are broken down into three categories: immediate payoffs, delayed payoffs, and social motivations. Immediate payoffs are compensations, typically monetary, received as soon as a task is completed. Delayed payoffs are benefits that can be used to generate future advantages, such as training skills and being noticed by potential employers. Social motivations are the rewards of behaving pro-socially,{{cite web|title=State of the World's Volunteerism Report 2011|url=http://www.unv.org/fileadmin/docdb/pdf/2011/SWVR/English/SWVR2011_full.pdf|publisher=Unv.org|access-date=2015-07-01|url-status=dead|archive-url=https://web.archive.org/web/20141202072036/http://www.unv.org/fileadmin/docdb/pdf/2011/SWVR/English/SWVR2011_full.pdf|archive-date=2014-12-02}} such as the altruistic motivations of online volunteers. Chandler and Kapelner found that US users of Amazon Mechanical Turk were more likely to complete a task when told they were going to help researchers identify tumor cells than when they were not told the purpose of their task. However, of those who completed the task, quality of output did not depend on the framing.{{cite journal | last1 = Chandler | first1 = D. | last2 = Kapelner | first2 =A. | title = Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets | journal = Journal of Economic Behavior & Organization | year = 2013| volume = 90 | pages = 123–133 | doi = 10.1016/j.jebo.2013.03.003 | arxiv = 1210.0962 | s2cid = 8563262 | url = http://www.danachandler.com/files/Chandler_Kapelner_BreakingMonotonyWithMeaning.pdf }}
Motivation in crowdsourcing is often a mix of intrinsic and extrinsic factors.{{Cite book | last1 = Aparicio | first1 = M. | last2 = Costa | first2 =C. |last3 = Braga | first3 =A. | title = Proceedings of the Workshop on Open Source and Design of Communication | chapter = Proposing a system to support crowdsourcing | pages = 13–17 | year = 2012| url = https://www.researchgate.net/publication/232659015 |format=PDF| doi = 10.1145/2316936.2316940 | isbn = 9781450315258 | s2cid = 16494503 }} In a crowdsourced law-making project, the crowd was motivated by both intrinsic and extrinsic factors. Intrinsic motivations included fulfilling civic duty, affecting the law for sociotropic reasons, and deliberating with and learning from peers. Extrinsic motivations included changing the law for financial gain or other benefits. Participation in crowdsourced policy-making was an act of grassroots advocacy, whether to pursue one's own interest or more altruistic goals, such as protecting nature. Participants in online research studies report their motivation as both intrinsic enjoyment and monetary gain.{{Cite book |last=Ipeirotis |first=Panagiotis G. |date=2010-03-10 |title=Demographics of Mechanical Turk |url=http://archive.nyu.edu/handle/2451/29585 }}{{Cite book |last1=Ross |first1=Joel |last2=Irani |first2=Lilly |last3=Silberman |first3=M. Six |last4=Zaldivar |first4=Andrew |last5=Tomlinson |first5=Bill |title=CHI '10 Extended Abstracts on Human Factors in Computing Systems |chapter=Who are the crowdworkers? |date=2010-04-10 |series=CHI EA '10 |location=New York, USA |publisher=Association for Computing Machinery |pages=2863–2872 |doi=10.1145/1753846.1753873 |isbn=978-1-60558-930-5|s2cid=11386257 }}{{Cite web |last1=Moss |first1=Aaron |last2=Rosenzweig |first2=Cheskie |last3=Robinson |first3=Jonathan |last4=Jaffe |first4=Shalom |last5=Litman |first5=Leib |date=2022 |title=Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages |url=https://psyarxiv.com/jbc9d/ |access-date=2023-01-12 |website=psyarxiv.com}}
Another form of social motivation is prestige or status. The International Children's Digital Library recruited volunteers to translate and review books. Because all translators receive public acknowledgment for their contributions, Kaufmann and Schulze cite this as a reputation-based strategy to motivate individuals who want to be associated with institutions that have prestige. Mechanical Turk uses reputation as a motivator in a different sense, as a form of quality control. Crowdworkers who frequently complete tasks in ways judged to be inadequate can be denied access to future tasks, whereas workers who pay close attention may be rewarded by gaining access to higher-paying tasks or being on an "Approved List" of workers. This system may incentivize higher-quality work.{{cite web|url=http://alexquinn.org/papers/Human%20Computation,%20A%20Survey%20and%20Taxonomy%20of%20a%20Growing%20Field%20(CHI%202011).pdf |first1=Alexander J. |last1=Quinn |first2= Benjamin B. |last2=Bederson|year= 2011|title=Human Computation:A Survey and Taxonomy of a Growing Field, CHI 2011 [Computer Human Interaction conference], May 7–12, 2011, Vancouver, BC, Canada| access-date= 30 June 2015}} However, this system only works when requesters reject bad work, which many do not.{{Cite journal |last1=Hauser |first1=David J. |last2=Moss |first2=Aaron J. |last3=Rosenzweig |first3=Cheskie |last4=Jaffe |first4=Shalom N. |last5=Robinson |first5=Jonathan |last6=Litman |first6=Leib |date=2022-11-03 |title=Evaluating CloudResearch's Approved Group as a solution for problematic data quality on MTurk |journal=Behavior Research Methods |volume=55 |issue=8 |pages=3953–3964 |doi=10.3758/s13428-022-01999-x |pmid=36326997|doi-access=free |pmc=10700412 }}
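Such reputation gating is implemented in practice through qualification requirements attached to the task itself. The following minimal sketch uses Amazon's public MTurk API via the boto3 Python library; the sandbox endpoint, task details, reward, question file, and the 95% approval threshold are illustrative assumptions, not values from the cited survey.

<syntaxhighlight lang="python">
import boto3

# Connect to the MTurk *sandbox* (a test environment; no real payments).
mturk = boto3.client(
    "mturk",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# "000000000000000000L0" is MTurk's built-in "percent assignments approved"
# qualification: only workers whose past work was approved at least 95% of
# the time may accept this task.
mturk.create_hit(
    Title="Label an image (illustrative task)",
    Description="Choose the best label for the image shown.",
    Reward="0.05",                         # payment per assignment, in USD
    MaxAssignments=3,                      # redundancy for quality control
    AssignmentDurationInSeconds=600,
    LifetimeInSeconds=86400,
    Question=open("question.xml").read(),  # assumed QuestionForm XML file
    QualificationRequirements=[{
        "QualificationTypeId": "000000000000000000L0",
        "Comparator": "GreaterThanOrEqualTo",
        "IntegerValues": [95],
        "ActionsGuarded": "Accept",        # gate acceptance, not discovery
    }],
)
</syntaxhighlight>

Because the threshold filters on requesters' past approval decisions, it inherits the weakness noted above: if requesters rarely reject bad work, a high approval rate carries little signal.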
Despite the potential global reach of IT applications online, recent research illustrates that differences in location{{which|date=May 2016}} affect participation outcomes in IT-mediated crowds.{{cite book|last=Prpić| first=J|author2=Shukla, P. |author3=Roth, Y. |author4=Lemoine, J.F. |year=2015 |chapter=A Geography of Participation in IT-Mediated Crowds |title=Proceedings of the Hawaii International Conference on Systems Sciences 2015.|ssrn=2494537}}
==Limitations and controversies==
While there is a lot of anecdotal evidence that illustrates the potential of crowdsourcing and the benefits organizations have derived from it, there is also scientific evidence that crowdsourcing initiatives often fail.{{Cite journal |last1=Dahlander |first1=Linus |last2=Piezunka |first2=Henning |date=2020-12-09 |title=Why crowdsourcing fails |journal=Journal of Organization Design |language=en |volume=9 |issue=1 |pages=24 |doi=10.1186/s41469-020-00088-7 |doi-access=free |issn=2245-408X|hdl=10419/252174 |hdl-access=free }} At least seven major topics cover the limitations and controversies about crowdsourcing:
- Failure to attract contributions
- Impact of crowdsourcing on product quality
- Entrepreneurs contribute less capital themselves
- Increased number of funded ideas
- The value and impact of the work received from the crowd
- The ethical implications of low wages paid to workers
- Trustworthiness and informed decision making
===Failure to attract contributions===
Crowdsourcing initiatives often fail to attract sufficient or beneficial contributions. The vast majority of crowdsourcing initiatives attract hardly any contributions; an analysis of thousands of organizations' crowdsourcing initiatives illustrates that only initiatives at or above the 90th percentile attract more than one contribution per month. While crowdsourcing initiatives may be effective in isolation, they may fail to attract sufficient contributions when faced with competition. Nagaraj and Piezunka (2024) illustrate that OpenStreetMap struggled to attract contributions once Google Maps entered a country.
===Impact of crowdsourcing on product quality===
Crowdsourcing allows anyone to participate, allowing for many unqualified participants and resulting in large quantities of unusable contributions.{{Cite news|date=2023-06-16|title=How Generative AI Can Augment Human Creativity|work=Harvard Business Review|url=https://hbr.org/2023/07/how-generative-ai-can-augment-human-creativity|access-date=2023-06-20|issn=0017-8012}} Companies, or additional crowdworkers, then have to sort through the low-quality contributions. The task of sorting through crowdworkers' contributions, along with the necessary job of managing the crowd, requires companies to hire actual employees, thereby increasing management overhead.{{cite web |title= The Case For and Against Crowdsourcing: Part 2 |url= http://www.crowdsourcing.org/editorial/the-case-for-and-against-crowdsourcing-part-2/2850 |first= Irma |last= Borst |access-date= 2015-02-09 |archive-url= https://web.archive.org/web/20150912024759/http://www.crowdsourcing.org/editorial/the-case-for-and-against-crowdsourcing-part-2/2850 |archive-date= 2015-09-12 |url-status= usurped }} Results are also susceptible to targeted, malicious work efforts. Since crowdworkers completing microtasks are paid per task, a financial incentive often causes workers to complete tasks quickly rather than well. Verifying responses is time-consuming, so employers often depend on having multiple workers complete the same task to correct errors. However, having each task completed multiple times increases time and monetary costs.{{cite book|last1=Ipeirotis |last2=Provost |last3=Wang |year=2010 |title=Quality Management on Amazon Mechanical Turk |url=http://people.stern.nyu.edu/panos/publications/hcomp2010.pdf |access-date=2012-02-28 |archive-url=https://web.archive.org/web/20120809230548/http://people.stern.nyu.edu/panos/publications/hcomp2010.pdf |archive-date=2012-08-09 |url-status=dead }} Some companies, like [https://www.cloudresearch.com/ CloudResearch], control data quality by repeatedly vetting crowdworkers to ensure they are paying attention and providing high-quality work.
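The redundancy strategy described above is typically resolved by a plurality vote over the repeated answers. A minimal, generic sketch in Python follows; the task names and labels are hypothetical, and this is not any platform's actual aggregation code.

<syntaxhighlight lang="python">
from collections import Counter

def plurality_vote(labels):
    """Return the most common answer among the redundant answers for one task.
    Ties are broken arbitrarily."""
    return Counter(labels).most_common(1)[0][0]

# Hypothetical responses: each task was assigned to three workers.
responses = {
    "task-1": ["cat", "cat", "dog"],  # one careless answer is outvoted
    "task-2": ["dog", "dog", "dog"],
}
consensus = {task: plurality_vote(answers) for task, answers in responses.items()}
print(consensus)  # {'task-1': 'cat', 'task-2': 'dog'}
</syntaxhighlight>

The cost trade-off mentioned above is visible directly in the sketch: n-fold redundancy multiplies the payment, and roughly the completion time, of every task by n.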
Crowdsourcing quality is also impacted by task design. Lukyanenko et al.{{cite journal|last1=Lukyanenko|first1=Roman|last2=Parsons|first2=Jeffrey|last3=Wiersma|first3=Yolanda|title=The IQ of the Crowd: Understanding and Improving Information Quality in Structured User-Generated Content|journal=Information Systems Research|date=2014|volume=25|issue=4|pages=669–689|doi=10.1287/isre.2014.0537}} argue that the prevailing practice of modeling crowdsourcing data collection tasks in terms of fixed classes (options) unnecessarily restricts quality. Their results demonstrate that information accuracy depends on the classes used to model domains: participants provide more accurate information when classifying phenomena at a more general level (which is typically less useful to sponsor organizations, and hence less common).{{Clarify|date=September 2022}} Further, greater overall accuracy is expected when participants can provide free-form data, compared with tasks in which they select from constrained choices. In behavioral science research, it is often recommended to include open-ended responses, in addition to other forms of attention checks, to assess data quality.{{Citation |last1=Hauser |first1=David |title=Evidence and Solutions |url=https://www.taylorfrancis.com/chapters/edit/10.4324/9781351137713-17/common-concerns-mturk-participant-pool-david-hauser-gabriele-paolacci-jesse-chandler |work=Handbook of Research Methods in Consumer Psychology |doi=10.4324/9781351137713-17 |access-date=2023-01-12 |last2=Paolacci |first2=Gabriele |last3=Chandler |first3=Jesse|date=15 April 2019 |isbn=9781351137713 |s2cid=150882624 |url-access=subscription }}{{Cite book|last1=Moss |first1=Aaron J |last2=Rosenzweig |first2=Cheskie |last3=Jaffe |first3=Shalom Noach |last4=Gautam |first4=Richa |last5=Robinson |first5=Jonathan |last6=Litman |first6=Leib |date=2021-06-11 |title=Bots or inattentive humans? Identifying sources of low-quality data in online platforms |url=https://osf.io/wr8ds |doi=10.31234/osf.io/wr8ds|s2cid=236288817 }}
Just as limiting, the crowd oftentimes lacks the skills or expertise needed to successfully accomplish the desired task. While this scenario does not affect "simple" tasks such as image labeling, it is particularly problematic for more complex tasks, such as engineering design or product validation. A comparison between evaluations of business models by experts and by an anonymous online crowd showed that the crowd cannot evaluate business models to the same level as experts.{{Cite journal|last1=Goerzen|first1=Thomas|last2=Kundisch|first2=Dennis|date=2016-08-11|title=Can the Crowd Substitute Experts in Evaluation of Creative Ideas? An Experimental Study Using Business Models|url=https://aisel.aisnet.org/amcis2016/Virtual/Presentations/10|journal=AMCIS 2016 Proceedings}} In these cases, it may be difficult or even impossible to find the qualified people in the crowd, as their responses represent only a small fraction of the answers, compared with those of consistent but incorrect crowd members.{{cite book|last1=Burnap|first1=Alex|last2=Ren|first2=Alex J.|last3=Papazoglou|first3=Giannis|last4=Gerth|first4=Richard|last5=Gonzalez|first5=Richard|last6=Papalambros|first6=Panos|title=When Crowdsourcing Fails: A Study of Expertise on Crowdsourced Design Evaluation|url=http://ode.engin.umich.edu/publications/PapalambrosPapers/2015/316J.pdf|access-date=2015-05-19|archive-url=https://web.archive.org/web/20151029001614/http://ode.engin.umich.edu/publications/PapalambrosPapers/2015/316J.pdf|archive-date=2015-10-29|url-status=dead}} However, if the task is "intermediate" in its difficulty, estimating crowdworkers' skills and intentions and leveraging them for inferring true responses works well,{{cite journal|last1=Kurve|first1=Aditya|last2=Miller|first2=David J.|last3=Kesidis|first3=George|title=Multicategory Crowdsourcing Accounting for Variable Task Difficulty, Worker Skill, and Worker Intention|journal=IEEE Kde|date=30 May 2014|issue=99}} albeit with an additional computation cost.{{Citation needed|date=September 2022}}
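The idea of estimating workers' skills and leveraging them to infer true responses can be sketched as a simple iterative scheme in the spirit of Dawid–Skene-style aggregation. This toy version uses a single accuracy weight per worker rather than the richer models of the cited work, and is an illustration only:

<syntaxhighlight lang="python">
def weighted_consensus(answers, label_set, n_iter=10):
    """Toy skill-weighted label aggregation.

    answers: dict mapping (worker, task) -> submitted label.
    Alternates between (1) inferring each task's label by a vote weighted
    by estimated worker skill and (2) re-estimating each worker's skill as
    their agreement rate with the current consensus."""
    workers = {w for (w, _) in answers}
    tasks = {t for (_, t) in answers}
    skill = {w: 1.0 for w in workers}  # start by trusting everyone equally
    consensus = {}
    for _ in range(n_iter):
        # Step 1: skill-weighted vote per task.
        for t in tasks:
            scores = {label: 0.0 for label in label_set}
            for (w, t2), label in answers.items():
                if t2 == t:
                    scores[label] += skill[w]
            consensus[t] = max(scores, key=scores.get)
        # Step 2: a worker's skill = fraction of answers matching consensus.
        for w in workers:
            own = [(t, label) for (w2, t), label in answers.items() if w2 == w]
            skill[w] = sum(label == consensus[t] for t, label in own) / len(own)
    return consensus, skill
</syntaxhighlight>

Provided a majority of workers are reliable, consistently wrong contributors are driven toward zero weight, whereas plain majority voting would keep counting their answers at full strength; the extra iterations are the "additional computation cost" mentioned above.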
Crowdworkers are a nonrandom sample of the population. Many researchers use crowdsourcing to quickly and cheaply conduct studies with larger sample sizes than would be otherwise achievable. However, due to limited access to the Internet, participation in less-developed countries is relatively low. Participation in highly developed countries is similarly low, largely because the low amount of pay is not a strong motivation for most users in these countries. These factors lead to a bias in the population pool towards users in moderately developed countries, as measured by the Human Development Index.{{Citation | last1 = Hirth | last2 =Hoßfeld |last3=Tran-Gia |year=2011 |title=Human Cloud as Emerging Internet Application – Anatomy of the Microworkers Crowdsourcing Platform |url=http://www3.informatik.uni-wuerzburg.de/TR/tr478.pdf }} Participants in these countries sometimes masquerade as U.S. participants to gain access to certain tasks. This led to the "bot scare" on Amazon Mechanical Turk in 2018, when researchers thought bots were completing research surveys due to the lower quality of responses originating from moderately developed countries.{{Cite web |last=Moss |first=Aaron |date=2018-09-18 |title=After the Bot Scare: Understanding What's Been Happening With Data Collection on MTurk and How to Stop It |url=https://www.cloudresearch.com/resources/blog/after-the-bot-scare-understanding-whats-been-happening-with-data-collection-on-mturk-and-how-to-stop-it/ |access-date=2023-01-12 |website=CloudResearch }}
The likelihood that a crowdsourced project will fail due to lack of monetary motivation or too few participants increases over the course of the project. Tasks that are not completed quickly may be forgotten, buried by filters and search procedures. This results in a long-tail power law distribution of completion times (a formal sketch is given below).{{cite journal|last1=Ipeirotis|first1=Panagiotis G.|year=2010|title=Analyzing the Amazon Mechanical Turk Marketplace|url=https://archive.nyu.edu/bitstream/2451/29801/4/CeDER-10-04.pdf|journal=XRDS: Crossroads, the ACM Magazine for Students|volume=17|issue=2|pages=16–21|doi=10.1145/1869086.1869094|ssrn=1688194|s2cid=6472586|access-date=2 October 2018}} Additionally, low-paying research studies online have higher rates of attrition, with participants not completing the study once started.
Even when tasks are completed, crowdsourcing does not always produce quality results. When Facebook began its localization program in 2008, it encountered some criticism for the low quality of its crowdsourced translations.{{cite web | url = https://www.nbcnews.com/id/wbna24205912 | title = Facebook asks users to translate for free | work = NBC News | first= Tomoko A.|last= Hosaka |date=April 2008}} One of the problems of crowdsourcing products is the lack of interaction between the crowd and the client. Usually little information is known about the final product, and workers rarely interact with the final client in the process. This can decrease the quality of the product, as client interaction is considered a vital part of the design process.{{cite web |title= Crowdsourcing: The Debate Roars On |url= http://insite.artinstitutes.edu/crowdsourcing-the-debate-roars-on-39739.aspx |first= Darice |last= Britt |access-date= 2012-12-04 |archive-url= https://web.archive.org/web/20140701173128/http://insite.artinstitutes.edu/crowdsourcing-the-debate-roars-on-39739.aspx |archive-date= 2014-07-01 |url-status= dead }}
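The "long-tail power law distribution of completion times" noted above can be made concrete with a survival function of the form

<math display="block">\Pr(T > t) \propto t^{-\alpha}, \qquad \alpha > 0,</math>

where <math>T</math> is a task's completion time and <math>\alpha</math> is a tail exponent, used here purely for illustration rather than as a value estimated in the cited study. Unlike an exponentially thin tail, this polynomial decay means a small share of tasks lingers for a very long time and dominates the average waiting time; for <math>\alpha \le 1</math> the mean is not even finite.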
Another cause of decreased product quality in crowdsourcing is the lack of collaboration tools. In a typical workplace, coworkers are organized in such a way that they can work together and build upon each other's knowledge and ideas. Furthermore, the company often provides employees with the necessary information, procedures, and tools to fulfill their responsibilities. In crowdsourcing, however, crowdworkers are left to depend on their own knowledge and means to complete tasks.
A crowdsourced project is usually expected to be unbiased by incorporating a large population of participants with a diverse background. However, most crowdsourcing work is done by people who are paid or who directly benefit from the outcome (e.g., much of the open-source development around Linux). In many other cases, the end product is the outcome of a single person's endeavor: one contributor creates the majority of the product, while the crowd participates only in minor details.{{cite news |title = The Myth of Crowdsourcing |url= https://www.forbes.com/2009/09/28/crowdsourcing-enterprise-innovation-technology-cio-network-jargonspy.html | first= Dan |last=Woods | date=28 September 2009 | access-date= 2012-12-04 | work=Forbes}}
===Entrepreneurs contribute less capital themselves===
The first component needed to turn an idea into reality is capital. Depending on the scope and complexity of the crowdsourced project, the amount of necessary capital can range from a few thousand dollars to hundreds of thousands, if not more. The capital-raising process can take from days to months depending on different variables, including the entrepreneur's network and the amount of initial self-generated capital.{{Citation needed|date=September 2022}}
The crowdsourcing process allows entrepreneurs to access a wide range of investors who can take different stakes in the project.{{Cite journal|url=https://www.academia.edu/963662 |title=The Promise of Idea Crowdsourcing: Benefits, Contexts, Limitations |website=Ideasproject.com |date= |access-date=2015-07-02|last1=Aitamurto |first1=Tanja |last2=Leiponen |first2=Aija }} As a result, crowdsourcing simplifies the capital-raising process and allows entrepreneurs to spend more time on the project itself and on reaching milestones, rather than on getting it started. Overall, the simplified access to capital can save time when starting projects and potentially increase their efficiency.{{Citation needed|date=September 2022}}
Others argue that easier access to capital through a large number of smaller investors can hurt the project and its creators. With a simplified capital-raising process involving more investors with smaller stakes, investors are more risk-seeking because they can choose an investment size with which they are comfortable. Because entrepreneurs no longer depend on a single investor for the survival of their project, they lose the experience of convincing investors who are wary of potential risks. Instead of being forced to assess risks and persuade large institutional investors that their project can succeed, entrepreneurs can simply replace wary investors with others willing to take on the risk.
Some translation companies and translation tool consumers pretend to use crowdsourcing as a means of drastically cutting costs, instead of hiring professional translators. This practice has been systematically denounced by IAPTI and other translator organizations.{{cite web |url=http://www.laht.com/article.asp?ArticleId=344753&CategoryId=14093 |title=International Translators Association Launched in Argentina |work=Latin American Herald Tribune |access-date=23 November 2016 |archive-date=11 March 2021 |archive-url=https://web.archive.org/web/20210311031022/http://www.laht.com/article.asp?ArticleId=344753&CategoryId=14093 |url-status=dead }}
===Increased number of funded ideas===
Both the raw number of ideas that get funded and the quality of those ideas are major points of controversy in crowdsourcing.
Proponents argue that crowdsourcing is beneficial because it allows the formation of startups with niche ideas that would not survive the scrutiny of venture capitalists or angel investors, who are oftentimes the primary investors in startups. Many ideas are scrapped in their infancy due to insufficient support and lack of capital, but crowdsourcing allows these ideas to be started if an entrepreneur can find a community to take interest in the project.{{cite web|last=Kleeman|first= Frank |year=2008 |title=Un(der)paid Innovators: The Commercial Utilization of Consumer Work through Crowdsourcing| url=http://www.sti-studies.de/ojs/index.php/sti/article/view/81/62|publisher=Sti-studies.de|access-date=2015-07-02}}
Crowdsourcing allows those who would benefit from the project to fund and become a part of it, which is one way for small niche ideas to get started.{{cite web|author=Jason|year=2011|title=Crowdsourcing: A Million Heads is Better Than One|url=http://www.crowdsourcing.org/document/crowdsourcing-a-million-heads-is-better-than-one/8619|publisher=Crowdsourcing.org|access-date=2015-07-02|archive-url=https://web.archive.org/web/20150703021755/http://www.crowdsourcing.org/document/crowdsourcing-a-million-heads-is-better-than-one/8619|archive-date=2015-07-03|url-status=usurped}} However, as the number of projects grows, the number of failures also increases. Crowdsourcing assists the development of niche and high-risk projects due to a perceived need from a select few who seek the product. With high risk and small target markets, the pool of crowdsourced projects faces a greater possible loss of capital, lower return, and lower levels of success.{{cite web|last=Dupree|first= Steven |year=2014 |title=Crowdfunding 101: Pros and Cons| url=http://www.gsb.stanford.edu/ces/crowdfunding-101|publisher=Gsb.stanford.edu|access-date=2015-07-02}}
===Other concerns===
Besides insufficient compensation and other labor-related disputes, there have also been concerns regarding privacy violations, the hiring of vulnerable groups, breaches of anonymity, psychological damage, the encouragement of addictive behaviors, and more.{{Cite book |last1=Shmueli |first1=Boaz |last2=Fell |first2=Jan |last3=Ray |first3=Soumya |last4=Ku |first4=Lun-Wei |title=Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies |chapter=Beyond Fair Pay: Ethical Implications of NLP Crowdsourcing |date=2021 |chapter-url=https://aclanthology.org/2021.naacl-main.295 |language=en |publisher=Association for Computational Linguistics |pages=3758–3769 |doi=10.18653/v1/2021.naacl-main.295|s2cid=233307331 }} Many, but not all, of the issues related to crowdworkers overlap with concerns related to content moderators.
==See also==
{{div col}}
- Chronolog – a citizen science environmental monitoring platform
- {{annotated link|Citizen science}}
- {{annotated link|Clickworkers}}
- {{annotated link|Collaborative innovation network}}
- {{annotated link|Collaborative mapping}}
- {{annotated link|Collective consciousness}}
- {{annotated link|Collective intelligence}}
- {{annotated link|Collective problem solving}}
- {{annotated link|Commons-based peer production}}
- {{annotated link|Crowd computing}}
- {{annotated link|Crowdcasting}}
- {{annotated link|Crowdfixing}}
- {{annotated link|Crowdsourcing software development}}
- {{annotated link|Distributed thinking}}
- {{annotated link|Distributed Proofreaders}}
- {{annotated link|Flash mob}}
- Folksonomy
- {{annotated link|Gamification}}
- {{annotated link|Government crowdsourcing}}
- {{annotated link|List of crowdsourcing projects}}
- Models of collaborative tagging
- {{annotated link|Microcredit}}
- {{annotated link|Participatory democracy}}
- {{annotated link|Participatory monitoring}}
- {{annotated link|Open knowledge}}
- {{annotated link|Smart mob}}
- {{annotated link|Social collaboration}}
- {{annotated link|Stone Soup}}
- {{annotated link|Truecaller}}
- {{annotated link|Virtual collective consciousness}}
- {{annotated link|Virtual volunteering}}
- {{annotated link|Wisdom of the crowd}}
- {{annotated link|Wiki survey}}
- {{annotated link|Crowdsource (app)}}
{{div col end}}
==References==
{{Reflist}}
==External links==
- {{Wikibooks inline}}
- {{Commons category-inline}}
{{Open navbox}}
{{Sharing economy}}
{{Authority control}}