Crowdsourcing is the process of obtaining needed services, ideas, or content by soliciting contributions from a large group of people, and especially from an online community, rather than from traditional employees or suppliers.[1] While this definition from Merriam-Webster is valid, a more specific definition is heavily debated.[2] The process of crowdsourcing is often used to subdivide tedious work[3] and has occurred successfully offline; see the examples below. It combines the efforts of numerous self-identified volunteers or part-time workers, where each contributor, on their own initiative, adds a small portion to the greater result. The term "crowdsourcing" is a portmanteau of "crowd" and "outsourcing"; it is distinguished from outsourcing in that the work comes from an undefined public rather than being commissioned from a specific, named group.
Coined in 2005, the word "crowdsourcing" can apply to a wide range of activities.[4] Crowdsourcing can involve dividing tedious work into small tasks distributed to a crowd, but it can also apply to specific requests, such as crowdfunding, a broad-based competition, or a general search for answers, solutions, or a missing person. Crowdtesting is another example of using the crowd, in this case to provide software testing services. Crowdtesting is becoming increasingly important in the software world; recent studies state that 55% of companies adopted crowdsourced services in 2014, and more plan to use crowdtesters in 2015 and beyond.[citation needed]
Contents
- 1 Definitions
- 2 Historical examples
- 2.1 Timeline of major events
- 2.2 The Oxford English Dictionary
- 2.3 Crowdsourcing in astronomy
- 2.4 Crowdsourcing in journalism
- 2.5 Crowdsourcing in ornithology
- 2.6 Crowdsourcing in genealogy research
- 2.7 Crowdsourcing in genetic genealogy research
- 2.8 Crowdsourcing in public policymaking
- 2.9 Early crowdsourcing competitions
- 3 Modern methods
- 4 Examples
- 5 Crowdsourcers
- 6 Limitations and Controversies
- 7 See also
- 8 References
- 9 Further Reading
- 10 External links
Definitions
Jeff Howe and Mark Robinson, editors at Wired magazine, coined the term "crowdsourcing" in 2005 after conversations about how businesses were using the Internet to outsource work to individuals.[4] Howe and Robinson concluded that what was happening was like "outsourcing to the crowd," which quickly led to the portmanteau "crowdsourcing." Howe first published a definition of the term in a companion blog post to his June 2006 Wired article, "The Rise of Crowdsourcing," which came out in print just days later:[5]
"Simply defined, crowdsourcing represents the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call. This can take the form of peer-production (when the job is performed collaboratively), but is also often undertaken by sole individuals. The crucial prerequisite is the use of the open call format and the large network of potential laborers."
In a February 1, 2008 article, Daren C. Brabham, "the first [person] to publish scholarly research using the word crowdsourcing" and author of the 2013 book Crowdsourcing, defined it as an "online, distributed problem-solving and production model."[6][7]
After studying more than 40 definitions of crowdsourcing in the scientific and popular literature, Enrique Estellés-Arolas and Fernando González Ladrón-de-Guevara, researchers at the Technical University of Valencia, developed a new integrating definition:[2]
"Crowdsourcing is a type of participative online activity in which an individual, an institution, a non-profit organization, or company proposes to a group of individuals of varying knowledge, heterogeneity, and number, via a flexible open call, the voluntary undertaking of a task. The undertaking of the task, of variable complexity and modularity, and in which the crowd should participate bringing their work, money, knowledge and/or experience, always entails mutual benefit. The user will receive the satisfaction of a given type of need, be it economic, social recognition, self-esteem, or the development of individual skills, while the crowdsourcer will obtain and utilize to their advantage that which the user has brought to the venture, whose form will depend on the type of activity undertaken".Henk van Ess, a college lecturer in online communications, emphasizes the need to "give back" the crowdsourced results to the public on ethical grounds. His non-scientific, non-commercial definition is widely cited in the popular press:[8]
"Crowdsourcing is channeling the experts’ desire to solve a problem and then freely sharing the answer with everyone."Despite the multiple definitions of crowdsourcing, one constant has been the broadcasting of problems to the public, and an open call for contributions to solving the problem. Members of the public submit solutions which are then owned by the entity which broadcast the problem. In some cases, the contributor of the solution is compensated monetarily, with prizes or with recognition. In other cases, the only rewards may be kudos or intellectual satisfaction. Crowdsourcing may produce solutions from amateurs or volunteers, working in their spare time, or from experts or small businesses which were unknown to the initiating organization.[3]
Another consequence of the multiple definitions is the controversy surrounding what kinds of activities can be considered crowdsourcing. For more information, see Controversies.
Historical examples
While the term "crowdsourcing" was popularized on the Internet to describe Internet-based activities,[7] there are examples of projects that in retrospect can be described as crowdsourcing.Timeline of major events[edit]
Brief timeline of events prior to 2006:
- 1714 – The Longitude Prize: When the British government was trying to find a way to measure a ship’s longitude, it offered the public a monetary prize to whoever came up with the best solution.[9]
- 1783 – King Louis XVI offered an award to the person who could ‘make the alkali’ by decomposing sea salt by the ‘simplest and most economic method.’[9]
- 1884 – Publication of the Oxford English Dictionary: 800 volunteers catalogued words to create the first fascicle of the OED[9]
- 1916 – Planters Peanuts contest: The Mr. Peanut logo was designed by a fourteen-year-old boy who won the Planters Peanuts logo contest.[9]
- 1957 – Jørn Utzon selected as winner of the design competition for the Sydney Opera House[9]
- 1970 - French amateur photo contest ‘C’était Paris en 1970’ (‘This Was Paris in 1970’) sponsored by the city of Paris, France-Inter radio, and the Fnac: 14,000 photographers produced 70,000 black-and-white prints and 30,000 color slides of the French capital to document the architectural changes of Paris. Photographs were donated to the Bibliothèque historique de la ville de Paris.[10]
- 1996 – The Hollywood Stock Exchange was founded: Allowed for the buying and selling of virtual shares in movies and celebrities.[9]
- 1997 – British rock band Marillion raised $60,000 from their fans to help finance their U.S. tour.[9]
- 2000 – JustGiving established: This online platform allows the public to help raise money for charities.[9]
- 2000 – UNV Online Volunteering service launched: Connecting people who commit their time and skills over the Internet to help organizations address development challenges[11]
- 2000 – iStockPhoto was founded: The free stock imagery website allows the public to contribute images and receive commissions for their contributions.[12]
- 2001 – Launch of Wikipedia: “Free-access, free content Internet encyclopedia”[13]
- 2004 – Toyota’s first “Dream car art” contest: Children were asked globally to draw their ‘dream car of the future.’[14]
- 2005 – Kodak’s “Go for the Gold” contest: Kodak asked anyone to submit a picture of a personal victory.[14]
- 2006 – Jeff Howe coined the term crowdsourcing in Wired magazine.[12]
The Oxford English Dictionary
The Oxford English Dictionary (OED) may provide one of the earliest examples of crowdsourcing. In the mid-19th century, an open call for volunteers was made for contributions identifying all words in the English language and example quotations exemplifying their usages.[15] They received over six million submissions over a period of 70 years.[16] The making of the OED is detailed in The Surgeon of Crowthorne (published in the United States under the title The Professor and the Madman), by Simon Winchester.[16]
Crowdsourcing in astronomy
Crowdsourcing in astronomy was used in the early 19th century by astronomer Denison Olmsted. After being awakened late one November night by a meteor shower, Olmsted noticed a pattern in the shooting stars and wrote a brief report of the shower in the local newspaper. “As the cause of ‘Falling Stars’ is not understood by meteorologists, it is desirable to collect all the facts attending this phenomenon, stated with as much precision as possible,” Olmsted wrote to readers, in a report subsequently picked up and pooled to newspapers nationwide. Responses came pouring in from many states, along with scientists’ observations sent to the American Journal of Science and Arts.[17] These responses helped him make a series of scientific breakthroughs, the major one being that meteor showers are seen nationwide and fall from space under the influence of gravity. The responses also showed that the showers appeared in yearly cycles, a fact that had often eluded scientists, and they allowed him to suggest a velocity for the meteors, although his estimate turned out to be too conservative; had he taken the responses at face value, his estimate of the meteors' velocity would have been closer to their actual speed.
A modern-day version of crowdsourcing in astronomy is NASA's photo organizing project,[18] which asks Internet users to browse photos taken from space and try to identify the location each picture documents.[19]
Crowdsourcing in journalism
Crowdsourcing is increasingly used in professional journalism. Journalists crowdsource information from the crowd, typically fact-check the information, and then use it in their articles as they see fit. The leading daily newspaper in Sweden successfully used crowdsourcing to investigate home-loan interest rates in the country in 2013-2014, and the leading daily newspaper in Finland crowdsourced an investigation into stock short-selling in 2011-2012.[20] TalkingPointsMemo in the United States asked its readers to examine 3,000 emails concerning the firing of federal prosecutors in 2008. The British newspaper The Guardian crowdsourced the examination of hundreds of thousands of documents in 2009.[21]
Crowdsourcing in ornithology
Another early example of crowdsourcing occurred in the field of ornithology. On December 25, 1900, Frank Chapman, an early officer of the National Audubon Society, initiated a tradition dubbed the "Christmas Day Bird Census". The project called on birders from across North America to count and record the number of birds of each species they witnessed on Christmas Day. The project was successful, and the records from 27 different contributors were compiled into one bird census, which tallied around 90 species of birds.[22] This large-scale collection of data constituted an early form of citizen science, the premise on which crowdsourcing is based. In the 2012 census, more than 70,000 individuals participated across 2,369 bird count circles.[23] Christmas 2014 marked the National Audubon Society's 115th annual Christmas Bird Count.
Crowdsourcing in genealogy research
Genealogical research used crowdsourcing techniques long before personal computers were common. Beginning in 1942, The Church of Jesus Christ of Latter-day Saints (LDS Church) encouraged its members to submit information about their ancestors, and the submitted information was gathered into a single collection. In 1969, in order to encourage more people to participate in gathering genealogical information about their ancestors, the church started the three-generation program, in which church members were asked to prepare documented family group record forms for the first three generations. The program was later expanded to encourage members to research at least four generations and became known as the four-generation program.[24]
Institutes that hold records of interest to genealogical research have used crowds of volunteers to create catalogs and indexes to those records.
Crowdsourcing in genetic genealogy research
Genetic genealogy is a combination of traditional genealogy with genetics. The rise of personal DNA testing after the turn of the century, by companies such as Gene by Gene, FTDNA, GeneTree, 23andMe, and Ancestry.com, led to public and semi-public databases of DNA test results that rely on crowdsourcing techniques. In recent years, citizen science projects have become increasingly focused on providing benefits to scientific research,[25][26][27] including the support, organization, and dissemination of personal DNA (genetic) testing. As in amateur astronomy, citizen scientists encouraged by volunteer organizations such as the International Society of Genetic Genealogy (ISOGG)[28] have provided valuable information and research to the professional scientific community.[29]
A blurb from Spencer Wells, PhD, Director of the Genographic Project:
Since 2005, the Genographic Project has used the latest genetic technology to expand our knowledge of the human story, and its pioneering use of DNA testing to engage and involve the public in the research effort has helped to create a new breed of "citizen scientist." Geno 2.0 expands the scope for citizen science, harnessing the power of the crowd to discover new details of human population history.[30]
Crowdsourcing in public policymaking
Governments across the world are increasingly using crowdsourcing for knowledge search and civic engagement. Iceland crowdsourced its constitutional reform process in 2011, and Finland has crowdsourced several law reform processes.[31] The House of Representatives in Brazil has used crowdsourcing in policy reforms, and federal agencies in the United States have used crowdsourcing for several years.[32]
Early crowdsourcing competitions
Crowdsourcing has often been used in the past as a competition to discover a solution. The French government proposed several of these competitions, often rewarded with Montyon Prizes, created for poor Frenchmen who had done virtuous acts.[33] These included the Leblanc process, or the Alkali Prize, where a reward was offered for a method of producing alkali from sea salt, and Fourneyron's turbine, the first commercial hydraulic turbine.[34]
In response to a challenge from the French government, Nicolas Appert won a prize for inventing a new way of food preservation that involved sealing food in air-tight jars.[35] The British government provided a similar reward for an easy way to determine a ship's longitude in the Longitude Prize. During the Great Depression, out-of-work clerks tabulated higher mathematical functions in the Mathematical Tables Project as an outreach project.[36] One of the biggest crowdsourcing campaigns was a public design contest in 2010, hosted by the Indian government's finance ministry, to create a symbol for the Indian rupee. Thousands of people sent in entries before the government settled on the final symbol, based on the Devanagari letter "Ra".[37]
Modern methods
Today, crowdsourcing has moved mainly to the Internet, which provides a particularly good venue because individuals tend to be more open in web-based projects where they are not being physically judged or scrutinized, and thus can feel more comfortable sharing. This ultimately allows for well-designed artistic projects because individuals are less conscious, or maybe even less aware, of scrutiny towards their work. In an online atmosphere, more attention can be given to the specific needs of a project, rather than spending as much time in communication with other individuals.[38]
According to a definition by Henk van Ess:[39]
"The crowdsourced problem can be huge (epic tasks like finding alien life or mapping earthquake zones) or very small ('where can I skate safely?'). Some examples of successful crowdsourcing themes are problems that bug people, things that make people feel good about themselves, projects that tap into niche knowledge of proud experts, subjects that people find sympathetic or any form of injustice."Crowdsourcing can either take an explicit or an implicit route. Explicit crowdsourcing lets users work together to evaluate, share and build different specific tasks, while implicit crowdsourcing means that users solve a problem as a side effect of something else they are doing.
With explicit crowdsourcing, users can evaluate particular items like books or webpages, or share by posting products or items. Users can also build artifacts by providing information and editing other people's work.
Implicit crowdsourcing can take two forms: standalone and piggyback. Standalone allows people to solve problems as a side effect of the task they are actually doing, whereas piggyback takes users' information from a third-party website to gather information.[40]
In his 2013 book, Crowdsourcing, Daren C. Brabham puts forth a problem-based typology of crowdsourcing approaches:[41]
- Knowledge Discovery & Management - for information management problems where an organization mobilizes a crowd to find and assemble information. Ideal for creating collective resources.
- Distributed Human Intelligence Tasking - for information management problems where an organization has a set of information in hand and mobilizes a crowd to process or analyze it. Ideal for processing large data sets in ways that computers cannot easily do alone.
- Broadcast Search - for ideation problems where an organization mobilizes a crowd to come up with a solution to a problem that has an objective, provable right answer. Ideal for scientific problem solving.
- Peer-Vetted Creative Production - for ideation problems where an organization mobilizes a crowd to come up with a solution to a problem which has an answer that is subjective or dependent on public support. Ideal for design, aesthetic, or policy problems.
Examples
There are some common categories of crowdsourcing that can be used effectively in the commercial world. Some of these web-based crowdsourcing efforts include crowdvoting, crowdfunding, microwork, creative crowdsourcing, crowdsource workforce management, and inducement prize contests. Although this may not be an exhaustive list, it covers the current major ways in which people use crowds to perform tasks.[42]
Crowdvoting
Crowdvoting occurs when a website gathers a large group's opinions and judgment on a certain topic. The Iowa Electronic Market is a prediction market that gathers crowds' views on politics and tries to ensure accuracy by having participants pay money to buy and sell contracts based on political outcomes.[43]
Some of the most famous examples have made use of social media channels: Domino's Pizza, Coca-Cola, Heineken and Sam Adams have thus crowdsourced a new pizza, bottle design, beer and song, respectively.[44] Threadless.com selects the T-shirts it sells by having users provide designs and vote on the ones they like, which are then printed and available for purchase.[7]
The California Report Card (CRC), a program jointly launched in January 2014 by the Center for Information Technology Research in the Interest of Society[45] and Lt. Governor Gavin Newsom, is an example of modern-day crowdvoting. Participants access the CRC online and vote on six timely issues. Through principal component analysis, the users are then placed into an online "café" in which they can present their own political opinions and grade the suggestions of other participants. This system aims to effectively involve the greater public in relevant political discussions and highlight the specific topics with which Californians are most concerned.
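The grouping step described above can be sketched in a few lines of Python. This is a hypothetical illustration of the general technique (principal component analysis over participants' grade vectors, followed by clustering into discussion groups), not the CRC's actual code; the synthetic data, the cluster count, and the use of k-means are assumptions.

```python
# Illustrative sketch: place crowd-voting participants into discussion groups
# by reducing their grade vectors with PCA and clustering the result.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
grades = rng.integers(1, 6, size=(200, 6))  # 200 participants, 6 issues, grades 1-5

# Project each participant's six grades onto their two main axes of variation.
coords = PCA(n_components=2).fit_transform(grades)

# Group nearby participants into online "cafes" for discussion.
cafes = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(coords)
for cafe_id in range(5):
    print(f"cafe {cafe_id}: {np.sum(cafes == cafe_id)} participants")
```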
In 2009, two researchers, founders of Movie Crowd-wisdom Intelligence Strategy, provided a proof of concept of crowdvoting's value in the movie industry.[46][47] Their findings showed that the crowd can accurately predict the success or failure of a movie based on its trailer. In 2013, a white paper from Google confirmed these results.[48]
Crowdsourcing creative work
Creative crowdsourcing spans creative projects such as graphic design, architecture, apparel design, movies,[49] writing, and illustration.[50][51][52]
Crowdsourcing language-related data
Crowdsourcing has also been used for gathering language-related data. For dictionary work, as mentioned above, it was applied over a hundred years ago by the Oxford English Dictionary editors, using paper and postage. Much later, a call for collecting examples of proverbs on a specific topic (religious pluralism) was printed in a journal.[53] Today, as "crowdsourcing" typically connotes Web-based activity, such language-related data gathering is increasingly being conducted on the Web. There are currently a number of dictionary compilation projects being conducted on the Web, particularly for languages that are not highly academically documented, such as the Oromo language.[54] Software programs such as WeSay have been developed for crowdsourced dictionaries.[55] A slightly different form of crowdsourcing for language data has been the online creation of scientific and mathematical terminology for American Sign Language.[56] Proverb collection is also being done via crowdsourcing on the Web, most innovatively for the Pashto language of Afghanistan and Pakistan.[57][58][59] Crowdsourcing has also been used extensively to collect high-quality gold-standard data for building automatic systems in natural language processing (e.g., named entity recognition, entity linking).[60]
Crowdsearching
The Chicago-based startup crowdfynd utilizes a version of crowdsourcing best termed crowdsearching, which differs from microwork in that there is no obligatory payment for taking part in the search.[61] Its platform, through geographic location anchoring, builds a virtual search party of smartphone and Internet users to find a lost item, pet, or person, as well as to return a found item, pet, or property.
Crowdfunding
Main article: Crowdfunding
Crowdfunding is the process of funding a project by raising small amounts of money from a large number of people in order to reach a monetary goal, typically via the Internet.[62] Two basic crowdfunding models exist. The model that has been around the longest is rewards-based crowdfunding, in which people can pre-purchase products, buy experiences, or simply donate. While this funding may in some cases go towards helping a business, funders are not allowed to invest and become shareholders via rewards-based crowdfunding.[63]
Individuals, businesses, and entrepreneurs can showcase their businesses and projects to the entire world by creating a profile, which typically includes a short video introducing the project, a list of rewards per donation, and illustrative images. The idea is to create a compelling message that readers will be drawn towards. Funders make monetary contributions for numerous reasons:
- They connect to the greater purpose of the campaign
- They connect to a physical aspect of the campaign like rewards
- They connect to the creative display of the campaign’s presentation
Crowdfunding sites include:[63]
- Kickstarter is a funding platform for creative projects, ranging from films, games, and music to art, design, and technology.
- Indiegogo is open to almost any kind of project (they even crowdfunded a baby[64]), and the company has a larger international presence than Kickstarter.
- Crowdrise is a platform for donating to charitable causes.
- Quirky is a rewards-based crowdfunding platform and online community most often used by product inventors and makers.
- Tilt is a rewards-based crowdfunding platform.
Mobile Crowdsourcing
Mobile crowdsourcing involves crowdsourcing activities that take place on smartphones or other mobile platforms, frequently making use of GPS technology.[65] This allows for real-time data gathering and gives projects greater reach and accessibility. However, mobile crowdsourcing can lead to an urban bias, as well as safety and privacy concerns.[66] Some examples of mobile crowdsourcing include TaskRabbit, EasyShift, Gigwalk, and Uber.
Macrowork
Macrowork tasks typically have the following characteristics: they can be done independently; they take a fixed amount of time; and they require special skills. Macrotasks could be part of specialized projects or could be part of a large, visible project where workers pitch in wherever they have the required skills. The key distinguishing factors are that macrowork requires specialized skills and typically takes longer, while microwork requires no specialized skills.
Microwork
Microwork is a form of crowdsourcing in which users complete small tasks, for which computers lack aptitude, in exchange for small amounts of money. Amazon's popular Mechanical Turk has created many different projects for users to participate in, where each task requires very little time and offers a very small payment.[3] The Chinese versions of this, commonly called Witkey, are similar and include sites such as Taskcn.com and k68.cn. When choosing tasks, since only certain users "win", users learn to submit later and pick less popular tasks in order to increase the likelihood of getting their work chosen.[67] An example of a Mechanical Turk project is one in which users searched satellite images for a boat in order to find lost researcher Jim Gray.[40]
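As a concrete illustration of how a requester might post such a microtask programmatically, the sketch below uses the boto3 client for Mechanical Turk. The title, reward, timings, and question form are illustrative placeholders rather than details of any project mentioned in this article, and a real task would need a complete question form.

```python
# Minimal sketch of posting a microtask (HIT) to Amazon Mechanical Turk via boto3.
# All task details below are illustrative placeholders.
import boto3

mturk = boto3.client("mturk", region_name="us-east-1")

# Simplified HTMLQuestion; a real HIT form must also post the worker's
# assignmentId back to MTurk when the worker submits an answer.
question_xml = """
<HTMLQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2011-11-11/HTMLQuestion.xsd">
  <HTMLContent><![CDATA[
    <html><body>
      <p>Does this satellite image tile contain a boat? Answer yes or no.</p>
    </body></html>
  ]]></HTMLContent>
  <FrameHeight>400</FrameHeight>
</HTMLQuestion>
"""

hit = mturk.create_hit(
    Title="Label a satellite image tile",
    Description="Report whether the image tile contains a boat.",
    Keywords="image, labeling, quick",
    Reward="0.05",                     # payment per completed assignment, in USD
    MaxAssignments=3,                  # redundant judgments for quality control
    LifetimeInSeconds=86400,           # how long the task remains available
    AssignmentDurationInSeconds=300,   # time a worker has to finish one assignment
    Question=question_xml,
)
print("Created HIT:", hit["HIT"]["HITId"])
```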
Inducement prize contests
Web-based idea competitions, or inducement prize contests, often consist of generic ideas, cash prizes, and an Internet-based platform to facilitate easy idea generation and discussion. One example is IBM's 2006 "Innovation Jam", attended by over 140,000 international participants and yielding around 46,000 ideas.[68][69] Another example is the Netflix Prize in 2009, which asked the crowd to come up with a recommendation algorithm more accurate than Netflix's own. The grand prize of US$1,000,000 was awarded to the team BellKor's Pragmatic Chaos, which bested Netflix's own algorithm for predicting ratings by 10.06%.
Another example of competition-based crowdsourcing is the 2009 DARPA balloon experiment, in which DARPA placed 10 balloon markers across the United States and challenged teams to compete to be the first to report the location of all the balloons. A collaboration of efforts was required to complete the challenge quickly, and in addition to the competitive motivation of the contest as a whole, the winning team (MIT, in less than nine hours) established its own "collaborapetitive" environment to generate participation in its team.[70] A similar challenge was the Tag Challenge, funded by the US State Department, which required locating and photographing individuals in five cities in the US and Europe within 12 hours based only on a single photograph. The winning team managed to locate three suspects by mobilizing volunteers worldwide using an incentive scheme similar to the one used in the Balloon Challenge.[71]
Open innovation platforms are an effective way of crowdsourcing people's thoughts and ideas for research and development. The company InnoCentive is a crowdsourcing platform for corporate research and development where difficult scientific problems are posted for crowds of solvers to discover the answer and win a cash prize, which can range from $10,000 to $100,000 per challenge.[7] InnoCentive, of Waltham, MA, and London, England, provides access to millions of scientific and technical experts from around the world. The company claims a success rate of 50% in providing successful solutions to previously unsolved scientific and technical problems. IdeaConnection.com challenges people to come up with new inventions and innovations, and Ninesigma.com connects clients with experts in various fields. The X PRIZE Foundation creates and runs incentive competitions offering between $1 million and $30 million for solving challenges. Local Motors is another example of crowdsourcing: a community of 20,000 automotive engineers, designers, and enthusiasts competes to build off-road rally trucks.[72]
Implicit crowdsourcing
Implicit crowdsourcing is less obvious because users do not necessarily know they are contributing, yet it can still be very effective in completing certain tasks. Rather than users actively participating in solving a problem or providing information, implicit crowdsourcing involves users doing another task entirely, from which a third party gains information on another topic based on the users' actions.[7]
A good example of implicit crowdsourcing is the ESP game, where users guess what images depict and these labels are then used to tag images for Google. Another popular use of implicit crowdsourcing is reCAPTCHA, which asks people to solve CAPTCHAs to prove they are human, and then serves CAPTCHAs from old books that cannot be deciphered by computers, in order to digitize them for the web. Like many tasks solved using the Mechanical Turk, CAPTCHAs are simple for humans but often very difficult for computers.[40]
Piggyback crowdsourcing is seen most frequently on websites such as Google that data-mine users' search histories and visited websites in order to discover keywords for ads, spelling corrections, and synonyms. In this way, users are unintentionally helping to modify existing systems, such as Google's AdWords.[73]
Health Care Crowdsourcing
Research has emerged that outlines the use of crowdsourcing techniques in the public health domain. It indicates that collective intelligence outcomes from crowdsourcing are being generated in three broad categories of public health care: health promotion, health research, and health maintenance.[74]
Crowdsourcers
There are a number of motivations for businesses to use crowdsourcing to accomplish their tasks, find solutions for problems, or to gather information. These include the ability to offload peak demand, access cheap labor and information, generate better results, access a wider array of talent than might be present in one organization, and undertake problems that would have been too difficult to solve internally.[75] Crowdsourcing allows businesses to submit problems on which contributors can work, on topics such as science, manufacturing, biotech, and medicine, with monetary rewards for successful solutions. Although it can be difficult to crowdsource complicated tasks, simple work tasks can be crowdsourced cheaply and effectively.[76]
Crowdsourcing also has the potential to be a problem-solving mechanism for government and nonprofit use.[77] Urban and transit planning are prime areas for crowdsourcing. One project to test crowdsourcing's public participation process for transit planning in Salt Lake City was carried out from 2008 to 2009, funded by a U.S. Federal Transit Administration grant.[78] Another notable application of crowdsourcing to government problem solving is the Peer to Patent Community Patent Review project for the U.S. Patent and Trademark Office.[79]
Researchers have used crowdsourcing systems, in particular Mechanical Turk, to aid research projects by crowdsourcing aspects of the research process such as data collection, parsing, and evaluation. Notable examples include using the crowd to create speech and language databases,[80][81] and using the crowd to conduct user studies.[73] Crowdsourcing systems give these researchers the ability to gather large amounts of data. Additionally, using crowdsourcing, researchers can collect data from populations and demographics they may not have had access to locally, which can improve the validity and value of their work.[82]
Artists have also utilized crowdsourcing systems. In his project The Sheep Market, Aaron Koblin used Mechanical Turk to collect 10,000 drawings of sheep from contributors around the world.[83] The artist Sam Brown leverages the crowd by asking visitors of his website explodingdog to send him sentences that he uses as inspiration for paintings.[84] Art curator Andrea Grover argues that individuals tend to be more open in crowdsourced projects because they are not being physically judged or scrutinized.[38] As with other crowdsourcers, artists use crowdsourcing systems to generate and collect data. The crowd also can be used to provide inspiration and to collect financial support for an artist's work.[85]
Additionally, crowdsourcing from 100 million drivers is being used by INRIX to collect users' driving times to provide better GPS routing and real-time traffic updates.[86]
Demographics
The crowd is an umbrella term for the people who contribute to crowdsourcing efforts. Though it is sometimes difficult to gather data about the demographics of the crowd, a study by Ross et al. surveyed the demographics of a sample of the more than 400,000 registered crowdworkers using Amazon Mechanical Turk to complete tasks for pay. A previous study in 2008 by Ipeirotis found that users at that time were primarily American, young, female, and well-educated, with 40% earning more than $40,000 per year. In November 2009, Ross found a very different Mechanical Turk population, 36% of which was Indian. Two-thirds of Indian workers were male, and 66% had at least a bachelor's degree. Two-thirds had annual incomes less than $10,000, with 27% sometimes or always depending on income from Mechanical Turk to make ends meet.[87]
The average US user of Mechanical Turk earned $2.30 per hour for tasks in 2009, versus $1.58 for the average Indian worker.[citation needed] While the majority of users worked less than five hours per week, 18% worked 15 hours per week or more. These earnings are less than the minimum wage in the United States (but not in India), which Ross suggests raises ethical questions for researchers who use crowdsourcing.
The demographics of Microworkers.com differ from Mechanical Turk in that the US and India together account for only 25% of workers. 197 countries are represented among users, with Indonesia (18%) and Bangladesh (17%) contributing the largest share. However, 28% of employers are from the US.[88]
Another study of the demographics of the crowd at iStockphoto found a crowd that was largely white, middle- to upper-class, higher educated, worked in a so-called "white collar job" and had a high-speed Internet connection at home.[89]
Studies have also found that crowds are not simply collections of amateurs or hobbyists. Rather, crowds are often professionally trained in a discipline relevant to a given crowdsourcing task and sometimes hold advanced degrees and many years of experience in the profession.[89][90][91][92] Claiming that crowds are amateurs, rather than professionals, is both factually untrue and may lead to marginalization of crowd labor rights.[93]
G. D. Saxton et al. (2013) studied the role of community users, among other elements, in a content analysis of 103 crowdsourcing organizations. Saxton et al. developed a taxonomy of nine crowdsourcing models (intermediary model, citizen media production, collaborative software development, digital goods sales, product design, peer-to-peer social financing, consumer report model, knowledge base building model, and collaborative science project model) in which to categorize the roles of community users, such as researcher, engineer, programmer, journalist, graphic designer, etc., and the products and services developed.[94]
Motivations
Contributors
Researchers found that the most frequently mentioned motives of users participating in crowdsourcing are: (1) money, (2) altruism, (3) fun, (4) reputation/attention, and (5) learning.[95] Many scholars of crowdsourcing suggest that there are both intrinsic and extrinsic motivations that cause people to contribute to crowdsourced tasks and that these factors influence different types of contributors.[89][90][92][96][97][98] For example, students and people employed full-time rate Human Capital Advancement as less important than part-time workers do, while women rate Social Contact as more important than men do.[96]
Intrinsic motivations are broken down into two categories: enjoyment-based and community-based motivations. Enjoyment-based motivations refer to motivations related to the fun and enjoyment that the contributor experiences through their participation. These motivations include: skill variety, task identity, task autonomy, direct feedback from the job, and pastime. Community-based motivations refer to motivations related to community participation, and include community identification and social contact.
Extrinsic motivations are broken down into three categories: immediate payoffs, delayed payoffs, and social motivations. Immediate payoffs, through monetary payment, are the immediately received compensations given to those who complete tasks. Delayed payoffs are benefits that can be used to generate future advantages, such as training skills and being noticed by potential employers. Social motivations are the rewards of behaving pro-socially,[99] such as the altruistic motivations of online volunteers. Chandler and Kapelner found that US users of the Amazon Mechanical Turk were more likely to complete a task when told they were going to “help researchers identify tumor cells” than when they were not told the purpose of their task. However, among those who completed the task, quality of output did not depend on the framing of the task.[96]
Another form of social motivation is prestige or status. The International Children's Digital Library recruits volunteers to translate and review books. Because all translators receive public acknowledgment for their contributions, Kaufmann and Schulze cite this as a reputation-based strategy to motivate individuals who want to be associated with institutions that have prestige. The Amazon Mechanical Turk uses reputation as a motivator in a different sense, as a form of quality control. Crowdworkers who frequently complete tasks in ways judged to be inadequate can be denied access to future tasks, providing motivation to produce high-quality work.[100]
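The reputation-based quality control just described can be illustrated with a minimal sketch. The following Python snippet is hypothetical: the function name, threshold values, and data format are illustrative assumptions rather than Mechanical Turk's actual mechanism or API. It shows how a requester-side gate might admit only workers whose past approval rate exceeds a threshold.
```python
def eligible_for_task(approval_history, min_completed=100, min_approval_rate=0.95):
    """Hypothetical reputation gate for a crowdsourcing requester.

    approval_history: list of booleans, True if a past submission was approved.
    Returns True only if the worker has enough history and a high approval rate.
    """
    if len(approval_history) < min_completed:
        return False  # too little history to judge reliability
    approval_rate = sum(approval_history) / len(approval_history)
    return approval_rate >= min_approval_rate

# Example: a worker with 120 submissions, 110 of them approved.
history = [True] * 110 + [False] * 10
print(eligible_for_task(history))  # False: approval rate ~0.92 is below 0.95
```
In a real marketplace the same idea is applied administratively rather than in code supplied by the worker: requesters can restrict a task to workers above a chosen approval rate, which is what gives the reputation score its motivating force.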
Requesters[edit]
Using crowdsourcing through means such as Amazon Mechanical Turk can provide researchers and requesters with an already established infrastructure for their projects, allowing them to easily utilize a crowd and to access participants from diverse cultural backgrounds. Crowdsourcing can also help complete work for projects that would normally face geographical and population-size limitations.[101]
Participation in Crowdsourcing[edit]
Despite the potential global reach of IT applications online, recent research illustrates that differences in location affect participation outcomes in IT-mediated crowds.[102]
Limitations and Controversies[edit]
There are at least five major topics covering limitations and controversies about crowdsourcing:
- impact of crowdsourcing on product quality,
- entrepreneurs contribute less capital themselves,
- increased number of funded ideas,
- the value and impact of the work received from the crowd, and
- the ethical implications of low wages paid to crowdworkers.
Impact of crowdsourcing on product quality[edit]
Crowdsourcing allows anyone to participate, which admits many unqualified participants and results in large quantities of unusable contributions. Companies, or additional crowdworkers, then have to sort through all of these low-quality contributions. The task of sorting through crowdworkers' contributions, along with the necessary job of managing the crowd, requires companies to hire actual employees, thereby increasing management overhead.[103] For example, crowdsourced work is susceptible to faulty results caused by targeted, malicious work efforts. Since crowdworkers completing microtasks are paid per task, there is often a financial incentive to complete tasks quickly rather than well. Verifying responses is time-consuming, so requesters often depend on having multiple workers complete the same task to correct errors. However, having each task completed multiple times increases time and monetary costs.[104]
Just as limiting, there is often not enough skill or expertise in the crowd to successfully accomplish the desired task. While this does not affect "simple" tasks such as image labeling, it is particularly problematic for more complex tasks such as engineering design or product validation. In these cases, it may be difficult or even impossible to find qualified people in the crowd, as their voices may be drowned out by consistent but incorrect crowd members.[105] However, if the task is even of "intermediate" difficulty, it has also been shown that estimating crowdworkers' skills and intentions and leveraging them to infer true responses works well,[106] albeit with an additional computation cost.
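The redundancy-based error correction mentioned above can be sketched in a few lines of code. The following Python snippet is an illustrative sketch only; the function name, task identifiers, and labels are hypothetical and are not taken from any cited platform or study. It aggregates redundant worker answers to the same task by simple majority vote, the most basic of the inference approaches discussed.
```python
from collections import Counter

def aggregate_by_majority(responses):
    """Pick the most common answer for each task from redundant worker responses.

    responses: dict mapping task_id -> list of answers from different workers.
    Returns a dict mapping task_id -> (winning answer, agreement ratio).
    """
    results = {}
    for task_id, answers in responses.items():
        counts = Counter(answers)
        answer, votes = counts.most_common(1)[0]
        results[task_id] = (answer, votes / len(answers))
    return results

# Example: three workers label the same two images.
worker_answers = {
    "img_001": ["cat", "cat", "dog"],
    "img_002": ["dog", "dog", "dog"],
}
print(aggregate_by_majority(worker_answers))
# {'img_001': ('cat', 0.67), 'img_002': ('dog', 1.0)} (approximately)
```
More sophisticated approaches, such as the skill- and intention-estimation methods cited above, weight each worker's answers rather than counting them equally, at the cost of additional computation.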
Crowdworkers are a nonrandom sample of the population. Many researchers use crowdsourcing in order to quickly and cheaply conduct studies with larger sample sizes than would otherwise be achievable. However, due to limited Internet access, participation from the least developed countries is relatively low. Participation from highly developed countries is similarly low, largely because the low pay is not a strong motivation for most users in these countries. These factors bias the participant pool toward users in moderately developed countries, as measured by the Human Development Index.[107]
The likelihood that a crowdsourced project will fail due to lack of monetary motivation or too few participants increases over the course of the project. Crowdsourcing markets are not a first-in-first-out queue. Tasks that are not completed quickly may be forgotten, buried by filters and search procedures so that workers do not see them. This results in a long tail power law distribution of completion times.[108] Additionally, low-paying research studies online have higher rates of attrition, with participants not completing the study once started.[82] Even when tasks are completed, crowdsourcing does not always produce quality results. When Facebook began its localization program in 2008, it encountered some criticism for the low quality of its crowdsourced translations.[109]
One of the problems of crowdsourcing products is the lack of interaction between the crowd and the client. Usually little information is available about the final desired product, and there is often very limited interaction with the final client. This can decrease the quality of the product, because client interaction is a vital part of the design process.[110]
An additional cause of the decrease in product quality that can result from crowdsourcing is the lack of collaboration tools. In a typical workplace, coworkers are organized in such a way that they can work together and build upon each other’s knowledge and ideas. Furthermore, the company often provides employees with the necessary information, procedures, and tools to fulfill their responsibilities. However, in crowdsourcing, crowdworkers are left to depend on their own knowledge and means to complete tasks.[103]
A crowdsourced project is usually expected to be unbiased because it incorporates a large population of participants with diverse backgrounds. However, most crowdsourcing work is done by people who are paid or who directly benefit from the outcome (for example, most open source development on Linux). In many other cases, the end product is the outcome of a single person's endeavour: that person creates the majority of the product, while the crowd only participates in minor details.[111]
Entrepreneurs Contribute Less Capital Themselves[edit]
To turn an idea into reality, the first thing needed is capital. Depending on the scope and complexity of the crowdsourced project, the amount of necessary capital can range from a few thousand dollars to hundreds of thousands. The capital-raising process can take anywhere from days to months, depending on variables including the entrepreneur’s network and the amount of initial self-generated capital. The crowdsourcing process gives entrepreneurs access to a wide range of investors who can take different stakes in the project.[112] In effect, crowdsourcing simplifies the capital-raising process and allows entrepreneurs to spend more time on the project itself and on reaching milestones rather than on getting it started. Overall, the simplified access to capital can save time when starting projects and potentially increase the efficiency of projects.
Opponents argue that easier access to capital through a large number of smaller investors can hurt the project and its creators. With a simplified capital-raising process involving more investors with smaller stakes, investors are more risk-seeking because they can take on an investment size they are comfortable with.[112] As a result, entrepreneurs lose the experience of convincing investors who are wary of potential risks, because they no longer depend on a single investor for the survival of their project. Instead of being forced to assess risks and convince large institutional investors why their project can be successful, entrepreneurs can simply replace wary investors with others who are willing to take on the risk.
Increased Number of Funded Ideas[edit]
The raw number of ideas that get funded and the quality of those ideas are major points of controversy in crowdsourcing. Proponents argue that crowdsourcing is beneficial because it allows niche ideas that would not survive venture capitalist or angel funding, often the primary investors in startups, to get started. Many ideas are killed in their infancy due to insufficient support and lack of capital, but crowdsourcing allows these ideas to be started if an entrepreneur can find a community to take an interest in the project.[113]
Crowdsourcing allows those who would benefit from a project to fund it and become a part of it, which is one way small niche ideas get started.[114] On the other hand, as the raw number of projects grows, the number of possible failures can also increase. Crowdsourcing assists niche and high-risk projects to start because of a perceived need from a select few who seek the product. With high risk and small target markets, the pool of crowdsourced projects faces a greater possible loss of capital, lower returns, and lower levels of success.[115]
Ethical concerns for crowdsourcers[edit]
Because crowdworkers are considered independent contractors rather than employees, they are not guaranteed minimum wage. In practice, workers using the Amazon Mechanical Turk generally earn less than the minimum wage, with US users earning an average of $2.30 per hour for tasks in 2009 and users in India earning an average of $1.58 per hour, which is below minimum wage in the United States (but not in India).[87][116] Some researchers who have considered using Mechanical Turk to recruit participants for research studies have argued that the wage conditions might be unethical.[82][117] However, according to other research, workers on Amazon Mechanical Turk do not feel that they are exploited and are ready to participate in crowdsourcing activities in the future.[118] When Facebook began its localization program in 2008, it received criticism for using free labor in crowdsourcing the translation of site guidelines.[109]
Typically, no written contracts, non-disclosure agreements, or employee agreements are made with crowdworkers. For users of the Amazon Mechanical Turk, this means that requesters decide whether users' work is acceptable and reserve the right to withhold pay if it does not meet their standards.[101] Critics say that crowdsourcing arrangements exploit individuals in the crowd, and there have been calls for crowds to organize for their labor rights.[93][119]
Collaboration between crowd members can also be difficult or even discouraged, especially in the context of competitive crowdsourcing. The crowdsourcing site InnoCentive allows organizations to solicit solutions to scientific and technological problems; only 10.6% of respondents report working in a team on their submission.[90]
See also[edit]
- Citizen science
- Clickworkers
- Collaborative innovation network
- Collective consciousness
- Collective intelligence
- Commons-based peer production
- Crowd computing
- Crowdcasting
- Crowdfixing
- Crowdsourcing software development
- Distributed thinking
- Flash mob
- Gamification
- Government crowdsourcing
- List of crowdsourcing projects
- Microcredit
- Participatory democracy
- Smart mob
- Social collaboration
- TrueCaller
- Virtual Collective Consciousness
- Virtual volunteering
- Wisdom of the crowd
References[edit]
- Jump up ^ "Crowdsourcing - Definition and More". Merriam-Webster.com. August 31, 2012. Retrieved 2014-02-03.
- ^ Jump up to: a b Estellés-Arolas, Enrique; González-Ladrón-de-Guevara, Fernando (2012), "Towards an Integrated Crowdsourcing Definition" (PDF), Journal of Information Science 38 (2): 189–200, doi:10.1177/0165551512437638
- ^ Jump up to: a b c Howe, Jeff (2006). "The Rise of Crowdsourcing". Wired.
- ^ Jump up to: a b Safire, William (February 5, 2009). "On Language". New York Times Magazine. Retrieved May 19, 2013.
- Jump up ^ Howe, Jeff (June 2, 2006). "Crowdsourcing: A Definition". Crowdsourcing Blog. Retrieved January 2, 2013.
- Jump up ^ "Daren C. Brabham". USC Annenberg. University of Southern California. Retrieved 17 September 2014.
- ^ Jump up to: a b c d e Brabham, Daren (2008), "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases", Convergence: The International Journal of Research into New Media Technologies 14 (1): 75–90, doi:10.1177/1354856507084420, archived from the original (PDF) on 2012-04-25
- Jump up ^ Claypole, Maurice (February 14, 2012). "Learning through crowdsourcing is deaf to the language challenge". The Guardian (London).
- ^ Jump up to: a b c d e f g h Dawson Ross, Getting Results From Crowds (2012), http://www.crowdsourcing.org/editorial/a-brief-history-of-crowdsourcing-infographic/12532
- Jump up ^ "‘C’était Paris en 1970’". etudesphotographiques.revues.org.
- Jump up ^ History of the Online Volunteering service, https://www.onlinevolunteering.org/en/org/about/history.html
- ^ Jump up to: a b Howe Jeff, The Rise of Crowdsourcing (2006), http://archive.wired.com/wired/archive/14.06/crowds.html
- Jump up ^ Lih, Andrew (2009). The Wikipedia revolution: how a bunch of nobodies created the world's greatest encyclopedia (1st ed.). New York: Hyperion. ISBN 1401303714.
- ^ Jump up to: a b "Crowdsourcing Back Up Timeline," http://www.tiki-toki.com/timeline/entry/323158/Crowdsourcing-Back-Up-Timeline-Early-Stories/
- Jump up ^ Winchester, Simon (1999). The Professor and the Madman. New York: HarperPerennial. ISBN 978-0060839789.
- ^ Jump up to: a b Lanxon, Nate (January 31, 2011). "How the Oxford English Dictionary started out like Wikipedia". Retrieved 2012-04-04.
- Jump up ^ Vergano, Dan. "1833 Meteor Storm Started Citizen Science". National Geographic. StarStruck. Retrieved 18 September 2014.
- Jump up ^ NASA's photo organizing project
- Jump up ^ McLaughlin, Elliot. "Image Overload: Help us sort it all out, NASA requests". Cnn.com (CNN). Retrieved 18 September 2014.
- Jump up ^ Aitamurto, Tanja (May 8, 2015). "Crowdsourcing as a Knowledge-Search Method in Digital Journalism: Ruptured Ideals and Blended Responsibility". Digital Journalism. doi:10.1080/21670811.2015.1034807.
- Jump up ^ Aitamurto, Tanja. "Balancing between open and closed: co-creation in magazine journalism". Digital Journalism 1 (2). doi:10.1080/21670811.2012.750150.
- Jump up ^ "History of the Christmas Bird Count | Audubon". birds.audubon.org.
- Jump up ^ [1]
- Jump up ^ "What Is the Four-Generation Program?". The Church of Jesus Christ of Latter-day Saints. Retrieved January 30, 2012.
- Jump up ^ Bonney, R. and LaBranche, M. (2004). Citizen Science: Involving the Public in Research. ASTC Dimensions. May/June 2004, p. 13.
- Jump up ^ Baretto, C.; Fastovsky, D.; Sheehan, P. (2003). "A Model for Integrating the Public into Scientific Research". Journal of Geoscience Education 50 (1): 71–75.
- Jump up ^ McCaffrey, R.E. (2005). "Using Citizen Science in Urban Bird Studies". Urban Habitats 3 (1): 70–86.
- Jump up ^ King, Turi E.; Jobling, Mark A. (2009). "What's in a name? Y chromosomes, surnames and the genetic genealogy revolution". Trends in Genetics 25 (8): 351–60. doi:10.1016/j.tig.2009.06.003. PMID 19665817.
The International Society of Genetic Genealogy (www.isogg.org) advocates the use of genetics as a tool for genealogical research, and provides a support network for genetic genealogists. It hosts the ISOGG Y-haplogroup tree, which has the virtue of being regularly updated.
- Jump up ^ Mendez, Fernando, et al. (28 February 2013). "An African American Paternal Lineage Adds an Extremely Ancient Root to the Human Y Chromosome Phylogenetic Tree". The American Journal of Human Genetics 92 (3): 454–459. Retrieved 10 July 2013.
- Jump up ^ Wells, Spencer (2013). "The Genographic Project and the Rise of Citizen Science". Southern California Genealogical Society (SCGS). Archived from the original on 2013-07-10. Retrieved July 10, 2013.
- Jump up ^ Aitamurto and Landemore. "Five design principles for crowdsourced policymaking: Assessing the case of crowdsourced off-road traffic law reform in Finland". Journal of Social Media for Organizations (1): 1–19.
- Jump up ^ Aitamurto, Tanja. Crowdsourcing for Democracy: New Era in Policymaking. Committee for the Future, Parliament of Finland. ISBN 978-951-53-3459-6.
- Jump up ^ "Antoine-Jean-Baptiste-Robert Auget, Baron de Montyon". New Advent. Retrieved February 25, 2012.
- Jump up ^ "It Was All About Alkali". Chemistry Chronicles. Retrieved February 25, 2012.
- Jump up ^ "Nicolas Appert". John Blamire. Retrieved February 25, 2012.
- Jump up ^ "9 Examples of Crowdsourcing, Before ‘Crowdsourcing’ Existed". MemeBurn. Retrieved February 25, 2012.
- Jump up ^ Pande, Shamni. "The People Know Best". Business Today. India: Living Media India Limited.
- ^ Jump up to: a b DeVun, Leah (November 19, 2009). "Looking at how crowds produce and present art.". Wired News. Archived from the original on 2012-10-24. Retrieved February 26, 2012.
- Jump up ^ Ess, Henk van "Crowdsourcing: how to find a crowd", ARD ZDF Akademie 2010, Berlin, p. 99,
- ^ Jump up to: a b c Doan, A.; Ramarkrishnan, R.; Halevy, A. (2011), "Crowdsourcing Systems on the World Wide Web" (PDF), Communications of the ACM 54 (4): 86–96, doi:10.1145/1924421.1924442
- Jump up ^ Brabham, Daren C. (2013), Crowdsourcing, MIT Press.
- Jump up ^ Howe, Jeff (2008), "Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business" (PDF), The International Achievement Institute.
- Jump up ^ Robson, John (February 24, 2012). "IEM Demonstrates the Political Wisdom of Crowds". Canoe.ca. Retrieved March 31, 2012.
- Jump up ^ "4 Great Examples of Crowdsourcing through Social Media". digitalagencymarketing.com. 2012.
- Jump up ^ Goldberg, Ken; Newsom, Gavin. "Let's amplify California's collective intelligence". Citris-uc.org. Retrieved 14 June 2014.
- Jump up ^ Escoffier, N. and B. McKelvey (2014). "Using "Crowd-Wisdom Strategy" to Co-Create Market Value: Proof-of-Concept from the Movie Industry." in International Perspective on Business Innovation and Disruption in the Creative Industries: Film, Video, Photography, P. Wikstrom and R. DeFillippi, eds., UK: Edward Elgar Publishing Ltd, Chap. 11.
- Jump up ^ Block, A. B. (2009). "How boxoffice trading could flop." The Hollywood Reporter, (April 22).
- Jump up ^ Chen, A. and R. Panaligan (2013). "Quantifying movie magic with Google search." Google White Paper, Industry Perspectives+User Insights
- Jump up ^ Cunard, C. (2010). "The Movie Research Experience gets audiences involved in filmmaking." The Daily Bruin, (July 19)
- Jump up ^ "Compete To Create Your Dream Home". FastCoexist.com. June 4, 2013. Retrieved 2014-02-03.
- Jump up ^ "Designers, clients forge ties on web". Boston Herald. June 11, 2012. Retrieved 2014-02-03.
- Jump up ^ Coleman, Alison (3 December 2014). "Disrupting In Style: Italian Startup Takes Interior Design To The Crowd". Forbes.com.
- Jump up ^ Stan Nussbaum. 2003. Proverbial perspectives on pluralism. Connections: the journal of the WEA Missions Committee October, pp. 30, 31.
- Jump up ^ "Oromo dictionary project". OromoDictionary.com. Retrieved 2014-02-03.
- Jump up ^ "Description of WeSay software and process" (PDF). Retrieved 2014-02-03.
- Jump up ^ "Developing ASL vocabulary for science and math". Washington.edu. December 7, 2012. Retrieved 2014-02-03.
- Jump up ^ "Pashto Proverb Collection project". AfghanProverbs.com. Retrieved 2014-02-03.
- Jump up ^ "Comparing methods of collecting proverbs" (PDF). gial.edu.
- Jump up ^ Edward Zellem. 2014. Mataluna: 151 Afghan Pashto Proverbs. Tampa, FL: Culture Direct.
- Jump up ^ "Web 2.0-based crowdsourcing for high-quality gold standard development in clinical Natural Language Processing". Jmir.org. doi:10.2196/jmir.2426. Retrieved 2014-02-03.
- Jump up ^ Lombard, Amy (May 5, 2013). "Crowdfynd: The First Place to Look". TIME.com. Retrieved 2014-02-03.
- Jump up ^ "What Is Crowdfunding And How Does It Benefit The Economy - Forbes". forbes.com.
- ^ Jump up to: a b "Crowdfunding Sites In 2014 - Forbes". forbes.com.
- Jump up ^ "Help The Haleys Have A Baby! | Indiegogo". indiegogo.com.
- Jump up ^ "Moblie Crowdsourcing". Clickworker. Retrieved 10 December 2014.
- Jump up ^ Thebault-Spieker, Terveen, & Hecht. Avoiding the South Side and the Suburbs: The Geography of Mobile Crowdsourcing Markets.
- Jump up ^ Yang, J.; Adamic, L.; Ackerman, M. (2008), "Crowdsourcing and Knowledge Sharing: Strategic User Behavior on Taskcn" (PDF), Proceedings of the 9th ACM Conference on Electronic Commerce
- Jump up ^ Leimeister, J.M.; Huber, M.; Bretschneider, U.; Krcmar, H. (2009), "Leveraging Crowdsourcing: Activation-Supporting Components for IT-Based Ideas Competition", Journal of Management Information Systems 26 (1): 197–224, doi:10.2753/mis0742-1222260108
- Jump up ^ Ebner, W.; Leimeister, J.; Krcmar, H. (2009), "Community Engineering for Innovations: The Ideas Competition as a method to nurture a Virtual Community for Innovations", R&D Management 39 (4): 342–356, doi:10.1111/j.1467-9310.2009.00564.x
- Jump up ^ "DARPA Network Challenge". DARPA Network Challenge. Retrieved November 28, 2011.
- Jump up ^ "Social media web snares 'criminals'". New Scientist. Retrieved April 4, 2012.
- Jump up ^ "Beyond XPrize: The 10 Best Crowdsourcing Tools and Technologies". February 20, 2012. Retrieved March 30, 2012.
- ^ Jump up to: a b Kittur, A.; Chi, E.H.; Sun, B. (2008), "Crowdsourcing user studies with Mechanical Turk" (PDF), CHI 2008
- Jump up ^ Prpić, J. (2015). Health Care Crowds: Collective Intelligence in Public Health. Collective Intelligence 2015. Center for the Study of Complex Systems, University of Michigan. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2570593. Retrieved March 7, 2015.
- Jump up ^ Noveck, Beth Simone (2009), Wiki Government: How Technology Can Make Government Better, Democracy Stronger, and Citizens More Powerful, Brookings Institution Press
- Jump up ^ Sarasua, Cristina; Simperl, Elena; Noy, Natalya F. (2012), "Crowdsourcing Ontology Alignment with Microtasks" (PDF), Institute AIFB. Karlsruhe Institute of Technology: 2
- Jump up ^ "Crowdfunding and Civic Society in Europe: A Profitable Partnership?". Open Citizenship Journal. Retrieved April 29, 2013.
- Jump up ^ Federal Transit Administration Public Transportation Participation Pilot Program, U.S. Department of Transportation
- Jump up ^ Peer-to-Patent Community Patent Review Project, Peer-to-Patent Community Patent Review Project
- Jump up ^ Callison-Burch, C.; Dredze, M. (2010), "Creating Speech and Language Data With Amazon’s Mechanical Turk" (PDF), Human Language Technologies Conference: 1–12
- Jump up ^ McGraw, I.; Seneff, S. (2011), "Growing a Spoken Language Interface on Amazon Mechanical Turk" (PDF), Interspeech: 3057–3060
- ^ Jump up to: a b c Mason, W.; Suri, S. (2010), "Conducting Behavioral Research on Amazon’s Mechanical Turk", Behavior Research Methods
- Jump up ^ Koblin, A. (2008), "The sheep market", Creativity and Cognition
- Jump up ^ Explodingdog
- Jump up ^ Linver, D. (2010), Crowdsourcing and the Evolving Relationship between Art and Artist
- Jump up ^ "Why Inrix - Inrix". inrix.com.
- ^ Jump up to: a b Ross, J.; Irani, L.; Silberman, M.S.; Zaldivar, A.; Tomlinson, B. (2010). "Who are the Crowdworkers? Shifting Demographics in Mechanical Turk" (PDF). CHI 2010.
- Jump up ^ Hirth, M.; Hoßfeld, T.; Tran-Gia, P. (2011), Human Cloud as Emerging Internet Application – Anatomy of the Microworkers Crowdsourcing Platform (PDF)
- ^ Jump up to: a b c Brabham, Daren C. (2008). "Moving the Crowd at iStockphoto: The Composition of the Crowd and Motivations for Participation in a Crowdsourcing Application". First Monday.
- ^ Jump up to: a b c Lakhani et al. (2007). "The Value of Openness in Scientific Problem Solving" (PDF). Retrieved February 26, 2012.
- Jump up ^ Brabham, Daren C. (2012). "Managing Unexpected Publics Online: The Challenge of Targeting Specific Groups with the Wide-Reaching Tool of the Internet". International Journal of Communication.
- ^ Jump up to: a b Brabham, Daren C. (2010). "Moving the Crowd at Threadless: Motivations for Participation in a Crowdsourcing Application". Information, Communication & Society 13: 1122–1145. doi:10.1080/13691181003624090.
- ^ Jump up to: a b Brabham, Daren C. (2012). "The Myth of Amateur Crowds: A Critical Discourse Analysis of Crowdsourcing Coverage". Information, Communication & Society 15: 394–410. doi:10.1080/1369118X.2011.641991.
- Jump up ^ Saxton, Oh, & Kishore (2013). "Rules of Crowdsourcing: Models, Issues, and Systems of Control.". Information Systems Management 30: 2–20. doi:10.1080/10580530.2013.739883.
- Jump up ^ Buettner, R. (2015). A Systematic Literature Review of Crowdsourcing Research from a Human Resource Management Perspective. 48th Annual Hawaii International Conference on System Sciences. Kauai, Hawaii: IEEE. pp. 4609–4618. ISBN 978-1-4799-7367-5.
- ^ Jump up to: a b c Kaufmann, N.; Schulze, T.; Viet, D. (2011). "More than fun and money. Worker Motivation in Crowdsourcing – A Study on Mechanical Turk" (PDF). Proceedings of the Seventeenth Americas Conference on Information Systems.
- Jump up ^ Brabham, Daren C. (2012). "Motivations for Participation in a Crowdsourcing Application to Improve Public Engagement in Transit Planning". Journal of Applied Communication Research 40: 307–328. doi:10.1080/00909882.2012.693940.
- Jump up ^ Lietsala, Katri; Joutsen, Atte (2007). "Hang-a-rounds and True Believers: A Case Analysis of the Roles and Motivational Factors of the Star Wreck Fans". MindTrek 2007 Conference Proceedings.
- Jump up ^ State of the World’s Volunteerism Report 2011 http://www.unv.org/fileadmin/docdb/pdf/2011/SWVR/English/SWVR2011_full.pdf
- Jump up ^ Quinn; Bederson (2010). "Human Computation: A Survey and Taxonomy of a Growing Field" (PDF). CHI 2011.
- ^ Jump up to: a b Paolacci, G; Chandler, J; Ipeirotis, P.G. (2010). "Running experiments on Amazon Mechanical Turk". Judgment and Decision Making 5 (5): 411–419.
- Jump up ^ Prpić, J; Shukla, P.; Roth, Y.; Lemoine, J.F. (2015). "A Geography of Participation in IT-Mediated Crowds". Proceedings of the Hawaii International Conference on Systems Sciences 2015. Retrieved November 8, 2014.
- ^ Jump up to: a b Borst, Irma. "The Case For and Against Crowdsourcing: Part 2". Retrieved 2015-02-09.
- Jump up ^ Ipeirotis; Provost; Wang (2010). "Quality Management on Amazon Mechanical Turk" (PDF).
- Jump up ^ Burnap, Alex; Ren, Alex J.; Papazoglou, Giannis; Gerth, Richard; Gonzalez, Richard; Papalambros, Panos. ""When Crowdsourcing Fails: A Study of Expertise on Crowdsourced Design Evaluation"" (PDF).
- Jump up ^ Kurve, Aditya; Miller, David J.; Kesidis, George (30 May 2014). "Multicategory Crowdsourcing Accounting for Variable Task Difficulty, Worker Skill, and Worker Intention". IEEE KDE (99).
- Jump up ^ Hirth; Hoßfeld; Tran-Gia (2011), Human Cloud as Emerging Internet Application - Anatomy of the Microworkers Crowdsourcing Platform (PDF)
- Jump up ^ Ipeirotis (2010). "Analyzing the Amazon Mechanical Turk Marketplace" (PDF). XRDS: Crossroads, The ACM Magazine for Students - Comp-YOU-Ter (ACM) 17 (2). doi:10.1145/1870000/1869094. Retrieved February 26, 2012.
- ^ Jump up to: a b Hosaka, Tomoko A. (April 2008). "Facebook asks users to translate for free". MSNBC.
- Jump up ^ Britt, Darice. "Crowdsourcing: The Debate Roars On". Retrieved 2012-12-04.
- Jump up ^ Woods, Dan (28 September 2009). "The Myth of Crowdsourcing". Forbes. Retrieved 2012-12-04.
- ^ Jump up to: a b Aitamurto, Tanja (2011). "The Promise of Idea Crowdsourcing – Benefits, Contexts, Limitations". http://www.academia.edu/963662/The_Promise_of_Idea_Crowdsourcing_Benefits_Contexts_Limitations
- Jump up ^ Kleeman, Frank (2008). "Un(der)paid Innovators: The Commercial Utilization of Consumer Work through Crowdsourcing".
- Jump up ^ Jason (2011). "Crowdsourcing: A Million Heads is Better Than One".
- Jump up ^ Dupree, Steven (2014). "Crowdfunding 101: Pros and Cons".
- Jump up ^ "Fair Labor Standards Act Advisor". Retrieved 28 February 2012.
- Jump up ^ Norcie (2011). "Ethical and Practical Considerations for Compensation of Crowdsourced Research Participants".
- Jump up ^ Busarovs, Aleksejs (2013). "Ethical Aspects of Crowdsourcing, or is it a Modern Form of Exploitation." (PDF). International Journal of Economics & Business Administration 1 (1): 3–14. Retrieved 26 November 2014.
- Jump up ^ The Crowdsourcing Scam (Dec. 2014), The Baffler, No. 26
Further Reading[edit]
How a Lone Hacker Shredded the Myth of Crowdsourcing
External links[edit]
Wikibooks has more on the topic of: Crowdsourcing