
User:Brndn.js/sandbox

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Brndn.js (talk | contribs) at 15:36, 17 September 2014 (Dumped info from previous article).
  1. Make "Types of Crowdsourcing" an L1 header and strike the "Modern methods" area.
  2. Move "Predecessors" to "History" in line with the definition above. Because crowdsourcing is not necessarily online, these options are not "predecessors" to crowdsourcing so much as historical, analog methods.
  3. Take out "Wisdom of the crowd" and "Crowdsearching" -- neither has enough information to justify a standalone section; each rests on isolated examples too similar to macrowork.

http://www.amberalert.gov/about.htm

Modern methods

Today, crowdsourcing has largely moved to the Internet, which provides a particularly good venue because individuals tend to be more open in web-based projects, where they are not physically judged or scrutinized and can therefore feel more comfortable sharing. This ultimately allows for well-designed artistic projects, because individuals are less conscious, or perhaps less aware, of scrutiny of their work. In an online atmosphere, more attention can be given to the specific needs of a project, rather than to communication with other individuals.[1]

Crowdsourcing can take either an explicit or an implicit route. Explicit crowdsourcing lets users work together to evaluate, share, and build different specific tasks, while implicit crowdsourcing means that users solve a problem as a side effect of something else they are doing.

With explicit crowdsourcing, users can evaluate particular items like books or webpages, or share by posting products or items. Users can also build artifacts by providing information and editing other people's work.

Implicit crowdsourcing can take two forms: standalone and piggyback. In the standalone form, people solve a problem as a side effect of the task they are actually doing, whereas the piggyback form gathers information from users' activity on a third-party website.[2]

Types of crowdsourcing

There are several common categories of crowdsourcing that can be used effectively in the commercial world. These web-based crowdsourcing efforts include crowdvoting, wisdom of the crowd, crowdfunding, microwork, creative crowdsourcing, crowdsourced workforce management, and inducement prize contests. Although this is not an exhaustive list, it covers the current major ways in which people use crowds to perform tasks.[3]

According to a definition by Henk van Ess:[4]

"The crowdsourced problem can be huge (epic tasks like finding alien life or mapping earthquake zones) or very small ('where can I skate safely?'). Some examples of successful crowdsourcing themes are problems that bug people, things that make people feel good about themselves, projects that tap into niche knowledge of proud experts, subjects that people find sympathetic or any form of injustice."

Crowdsourcing typology

In his 2013 book, Crowdsourcing, Daren C. Brabham puts forth a problem-based typology of crowdsourcing approaches:[5]

  • Knowledge Discovery & Management - for information management problems where an organization mobilizes a crowd to find and assemble information. Ideal for creating collective resources.
  • Distributed Human Intelligence Tasking - for information management problems where an organization has a set of information in hand and mobilizes a crowd to process or analyze the information. Ideal for processing large data sets beyond what computers can easily handle.
  • Broadcast Search - for ideation problems where an organization mobilizes a crowd to come up with a solution to a problem that has an objective, provable right answer. Ideal for scientific problem solving.
  • Peer-Vetted Creative Production - for ideation problems where an organization mobilizes a crowd to come up with a solution to a problem which has an answer that is subjective or dependent on public support. Ideal for design, aesthetic, or policy problems.

Crowdvoting

Crowdvoting occurs when a website gathers a large group's opinions and judgment on a certain topic. The Iowa Electronic Market is a prediction market that gathers crowds' views on politics and tries to ensure accuracy by having participants pay money to buy and sell contracts based on political outcomes.[6]
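In a winner-take-all market like the IEM, each contract pays $1 if its outcome occurs, so trading prices can be read as the crowd's implied probabilities. A minimal sketch of that reading, using hypothetical prices rather than actual IEM data:

```python
# In a winner-take-all prediction market, a contract pays $1 if its
# outcome occurs, so a contract trading at $0.62 implies roughly a 62%
# crowd estimate. Prices across outcomes may not sum exactly to 1,
# so we normalize them.

def implied_probabilities(prices):
    """Map each outcome's contract price to a normalized probability."""
    total = sum(prices.values())
    return {outcome: price / total for outcome, price in prices.items()}

# Hypothetical two-candidate race (illustrative prices only).
prices = {"Candidate A": 0.62, "Candidate B": 0.40}
probs = implied_probabilities(prices)
```

The normalization step mirrors how observers quote market-implied odds when bid-ask spreads leave the raw prices summing slightly above or below one.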

Some of the most famous examples have made use of social media channels: Domino's Pizza, Coca-Cola, Heineken, and Sam Adams have thus crowdsourced a new pizza, bottle design, beer, and song, respectively.[7] Threadless.com selects the t-shirts it sells by having users submit designs and vote on the ones they like, which are then printed and offered for purchase.[8]

The California Report Card (CRC), a program jointly launched in January 2014 by the Center for Information Technology Research in the Interest of Society[9] and Lt. Governor Gavin Newsom, is a strong example of modern-day crowdvoting. Participants access the CRC online and vote on six timely issues. Through principal component analysis, the users are then placed into an online "cafe" in which they can present their own political opinions and grade the suggestions of other participants. This system aims to involve the greater public in relevant political discussions and to highlight the specific topics with which Californians are most concerned.
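The CRC's actual pipeline is not public here; as a rough illustration of how principal component analysis can place participants with similar grading patterns near one another, the sketch below computes each participant's coordinate along the first principal component of their issue grades (all data hypothetical):

```python
# Illustrative only (not the CRC's actual code): participants grade a
# set of issues 0-100; the first principal component of the grade
# matrix gives each participant one coordinate, so people with similar
# opinion profiles land near each other in an online "cafe".

def first_principal_component(rows, iterations=200):
    """Power iteration on the covariance of mean-centered rows.

    Returns the unit principal direction and each row's score along it.
    """
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    centered = [[r[j] - means[j] for j in range(d)] for r in rows]
    v = [1.0] * d
    for _ in range(iterations):
        # w = C v, where C is the (unnormalized) covariance matrix
        w = [0.0] * d
        for row in centered:
            proj = sum(row[j] * v[j] for j in range(d))
            for j in range(d):
                w[j] += proj * row[j]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    scores = [sum(c[j] * v[j] for j in range(d)) for c in centered]
    return v, scores

# Hypothetical grades on two issues: the first two participants agree,
# the third holds the opposite profile.
v, scores = first_principal_component([[90, 10], [85, 20], [15, 95]])
```

Here the first two participants receive nearby scores and the third a distant one, which is the grouping behavior the "cafe" placement relies on.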

Crowdsourcing creative work

Creative crowdsourcing spans creative projects such as graphic design, architecture, apparel design, writing, and illustration.[10][11]

Crowdsourcing has also been used to gather language-related data. For dictionary work, it was applied over a hundred years ago by the Oxford English Dictionary editors, using paper and postage. Much later, a call for collecting examples of proverbs on a specific topic (religious pluralism) was printed in a journal.[12] Today, as "crowdsourcing" carries the inherent connotation of being Web-based, such language-related data gathering is accelerating on the Web. A number of dictionary compilation projects are being conducted online, particularly for languages that are not highly documented academically, such as Oromo.[13] Software programs such as WeSay have been developed for crowdsourced dictionaries.[14] A slightly different form of crowdsourcing for language data has been the online creation of scientific and mathematical terminology for American Sign Language.[15] Proverb collection is also being done via crowdsourcing on the Web, most innovatively for the Pashto language of Afghanistan and Pakistan.[16] Crowdsourcing has also been used extensively to collect high-quality gold-standard data for building automatic systems in natural language processing (e.g., named entity recognition, entity linking).[17]

Crowdsearching

The Chicago-based startup Crowdfynd uses a version of crowdsourcing best termed crowdsearching, which differs from microwork in that there is no obligatory payment for taking part in the search.[18] Its platform, through geographic location anchoring, builds a virtual search party of smartphone and Internet users to find a lost item, pet, or person, or to return a found item, pet, or property.

Crowdfunding

Crowdfunding is the process of funding a project through a multitude of people each contributing a small amount in order to attain a monetary goal.[19] Goals may be met through donations or through equity in a project. The dilemma for equity crowdfunding in the US as of 2012 was how the SEC would regulate the process. At the time, rules and regulations were being refined by the SEC, which had until January 1, 2013, to finalize the fundraising rules. The regulators were already overwhelmed regulating Dodd–Frank and the other rules and regulations involving public companies and the way they trade. Advocates of regulation claimed that crowdfunding would open the floodgates for fraud, called it the "wild west" of fundraising, and compared it to the 1980s days of penny-stock "cold-call cowboys." The process allows up to $1 million to be raised without many of the usual regulations. Under the then-current proposal, companies would have many exemptions available and could raise capital from a larger pool of persons, with much lower thresholds for investor criteria, whereas the old rules required that the person be an "accredited" investor. Contributors are often recruited from social networks, and funds can be acquired through an equity purchase, loan, donation, or pre-ordering. The amounts collected have become quite high, with requests exceeding a million dollars, as with Trampoline Systems, which used crowdfunding to finance the commercialization of its new software.[20]

A well-known crowdfunding website is Kickstarter, which funds creative projects. It has raised over $100 million, despite its all-or-nothing model, which requires a campaign to reach its proposed monetary goal in order to collect the money. Crowdrise brings together volunteers to fundraise in an online environment.[21] In a related vein, Offbeatr was launched in 2012 to crowdfund pornography.[22] Citizinvestor is a US crowdfunding platform specifically focused on raising money for public projects and community infrastructure.[23]
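Kickstarter's all-or-nothing model is simple to state precisely: pledges are collected only if the campaign's total reaches its goal. A minimal sketch of that settlement rule (function name and amounts are illustrative):

```python
# Sketch of a Kickstarter-style "all-or-nothing" settlement rule:
# pledges are collected only if the campaign total meets the goal;
# otherwise no money changes hands.

def settle_campaign(goal, pledges):
    """Return the amount collected under all-or-nothing funding."""
    total = sum(pledges)
    return total if total >= goal else 0

funded = settle_campaign(1000, [250, 400, 500])   # goal met: 1150 collected
unfunded = settle_campaign(1000, [250, 400])      # goal missed: 0 collected
```

Note that a successful campaign collects the full pledged total, not just the goal amount, which is how overfunded projects arise.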

"Wisdom of the crowd"

iStockPhoto provides a platform where people upload photos that others can purchase for low prices. Clients purchase photos through credits, giving photographers a small profit. Here again, the photo collection is shaped by the crowd's decisions, made at very low prices.[8]

In February 2012, a stock-picking game called Ticker Picker Pro was launched, using crowdsourcing to create a hedge fund that would buy and sell stocks based on the ideas coming out of the game. The hope was that these crowdsourced ideas, coming from so many people, could help pick the best stocks, on the premise that collective judgments are better than individual ones.[24]
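The premise that aggregated estimates beat typical individual ones can be shown with a toy simulation (unrelated to the actual Ticker Picker Pro fund): many independent, noisy guesses of a quantity average out to something closer to the truth than the typical single guess.

```python
import random

# Toy "wisdom of the crowd" simulation: each of 1000 participants makes
# an unbiased but noisy guess at a true value; the crowd's mean guess
# ends up far closer to the truth than the average individual guess.

random.seed(42)  # fixed seed so the illustration is reproducible
truth = 100.0
guesses = [truth + random.gauss(0, 20) for _ in range(1000)]

crowd_estimate = sum(guesses) / len(guesses)
crowd_error = abs(crowd_estimate - truth)
typical_individual_error = sum(abs(g - truth) for g in guesses) / len(guesses)
```

This averaging effect depends on the errors being independent and unbiased; correlated or systematically skewed guesses do not cancel out.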

Macrowork

Macrowork tasks typically have the following characteristics: they can be done independently, they take a fixed amount of time, and they require special skills. Macrotasks can be part of specialized projects or of a large, visible project where workers pitch in wherever they have the required skills. The key distinguishing factors are that macrowork requires specialized skills and typically takes longer, while microwork requires no specialized skills.

Microwork

Microwork is a form of crowdsourcing in which users perform small tasks that computers lack the aptitude for, in exchange for small amounts of money. Amazon's popular Mechanical Turk has created many different projects for users to participate in, where each task requires very little time and offers a very small payment.[25] The Chinese versions of this, commonly called Witkey, are similar and include sites such as Taskcn.com and k68.cn. Since only certain users "win" each task, users learn to submit later and to pick less popular tasks in order to increase the likelihood of getting their work chosen.[26] An example of a Mechanical Turk project is when users searched satellite images for a boat in order to find the lost researcher Jim Gray.[2]
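The strategic behavior observed on Taskcn-style sites can be framed as a toy expected-value calculation (all numbers hypothetical): in a winner-take-one contest with n competing submissions, a task's expected payoff is roughly reward / n, so a less popular task can be worth more in expectation despite a smaller reward.

```python
# Toy expected-value model of task choice on a winner-take-one
# microwork market (numbers hypothetical): with n competing submissions
# and one winner picked among them, a task's expected payoff is
# roughly reward / n.

def expected_payoff(reward, competitors):
    """Expected earnings when one of `competitors` submissions wins."""
    return reward / competitors

popular_task = expected_payoff(100, 50)  # big reward, crowded field
niche_task = expected_payoff(30, 5)      # small reward, few rivals
```

This simple model assumes each submission is equally likely to win, which is why submitting late (after the field is visible) helps workers estimate n before committing effort.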

Inducement prize contests

Web-based idea competitions, or inducement prize contests, often consist of generic ideas, cash prizes, and an Internet-based platform to facilitate easy idea generation and discussion. One example is IBM's 2006 "Innovation Jam", attended by over 140,000 international participants and yielding around 46,000 ideas.[27][28] Another is the Netflix Prize, which asked the crowd to come up with a recommendation algorithm more accurate than Netflix's own. The grand prize of US$1,000,000 was awarded in 2009 to the BellKor's Pragmatic Chaos team, which bested Netflix's algorithm for predicting ratings by 10.06%.

Another example of competition-based crowdsourcing is the 2009 DARPA balloon experiment, in which DARPA placed 10 balloon markers across the United States and challenged teams to be the first to report the locations of all the balloons. Completing the challenge quickly required collaboration, and in addition to the competitive motivation of the contest as a whole, the winning team (MIT, in less than nine hours) established its own "collaborapetitive" environment to generate participation.[29] A similar challenge was the Tag Challenge, funded by the US State Department, which required locating and photographing individuals in five cities in the US and Europe within 12 hours based only on a single photograph. The winning team managed to locate three suspects by mobilizing volunteers worldwide using an incentive scheme similar to the one used in the Balloon Challenge.[30]

Open innovation platforms are a very effective way of crowdsourcing people's thoughts and ideas for research and development. The company InnoCentive is a crowdsourcing platform for corporate research and development, where difficult scientific problems are posted for crowds of solvers, who can win a cash prize ranging from $10,000 to $100,000 per challenge.[8] InnoCentive, of Waltham, MA, and London, England, provides access to millions of scientific and technical experts from around the world. The company has provided expert crowdsourcing to Fortune 1000 companies in the US and Europe as well as to government agencies and nonprofits, and claims a 50% success rate in providing solutions to previously unsolved scientific and technical problems. IdeaConnection.com challenges people to come up with new inventions and innovations, and Ninesigma.com connects clients with experts in various fields. The X PRIZE Foundation creates and runs incentive competitions in which one can win between $1 million and $30 million for solving challenges. Local Motors is another example: a community of 20,000 automotive engineers, designers, and enthusiasts competes to build off-road rally trucks.[21]

Implicit crowdsourcing

Implicit crowdsourcing is less obvious because users do not necessarily know they are contributing, yet it can still be very effective in completing certain tasks. Rather than users actively participating in solving a problem or providing information, implicit crowdsourcing involves users doing another task entirely, while a third party gains information on a separate topic from the users' actions.[8]

A good example of implicit crowdsourcing is the ESP game, in which users guess what images depict; these labels are then used to tag Google images. Another popular use of implicit crowdsourcing is reCAPTCHA, which asks people to solve CAPTCHAs to prove they are human, and serves CAPTCHAs taken from old books that computers cannot decipher, thereby digitizing them for the web. Like many tasks solved using Mechanical Turk, CAPTCHAs are simple for humans but often very difficult for computers.[2]
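The ESP game's core quality check is agreement: a label counts only when two players, working independently, both propose it. A minimal sketch of that matching rule (function name and inputs are illustrative):

```python
# ESP-game-style label matching: two players label the same image
# independently, and a label is accepted only when both propose it,
# which filters out idiosyncratic or low-quality labels.

def agreed_labels(player_a, player_b):
    """Return the labels proposed by both players (case-insensitive)."""
    return {s.lower() for s in player_a} & {s.lower() for s in player_b}

labels = agreed_labels(["dog", "Beach", "sunset"], ["beach", "dog", "frisbee"])
```

Because neither player can see the other's guesses, agreement on a label is strong evidence that it genuinely describes the image.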

Piggyback crowdsourcing is seen most frequently on websites such as Google, which data-mine users' search histories and browsing in order to discover keywords for ads, spelling corrections, and synonyms. In this way, users unintentionally help refine existing systems, such as Google's AdWords.[31]

  1. ^ DeVun, Leah (November 19, 2009). "Looking at how crowds produce and present art". Wired News. Retrieved February 26, 2012.
  2. ^ a b c Doan, A.; Ramakrishnan, R.; Halevy, A. (2011), "Crowdsourcing Systems on the World Wide Web" (PDF), Communications of the ACM, 54 (4): 86–96, doi:10.1145/1924421.1924442
  3. ^ Howe, Jeff (2008), "Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business" (PDF), The International Achievement Institute.
  4. ^ Ess, Henk van, "Crowdsourcing: how to find a crowd", ARD ZDF Akademie, Berlin, 2010, p. 99.
  5. ^ Brabham, Daren C. (2013), Crowdsourcing, MIT Press.
  6. ^ Robson, John (February 24, 2012). "IEM Demonstrates the Political Wisdom of Crowds". Canoe.ca. Retrieved March 31, 2012.
  7. ^ "4 Great Examples of Crowdsourcing through Social Media". digitalagencymarketing.com. 2012.
  8. ^ a b c d Brabham, Daren (2008), "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases" (PDF), Convergence: The International Journal of Research into New Media Technologies, 14 (1): 75–90, doi:10.1177/1354856507084420
  9. ^ Goldberg, Ken; Newsom, Gavin. "Let's amplify California's collective intelligence". Citris-uc.org. Retrieved 14 June 2014.
  10. ^ "Compete To Create Your Dream Home". FastCoexist.com. June 4, 2013. Retrieved 2014-02-03.
  11. ^ "Designers, clients forge ties on web". Boston Herald. June 11, 2012. Retrieved 2014-02-03.
  12. ^ Stan Nussbaum. 2003. Proverbial perspectives on pluralism. Connections: the journal of the WEA Missions Committee October, pp. 30, 31.
  13. ^ "Oromo dictionary project". OromoDictionary.com. Retrieved 2014-02-03.
  14. ^ "Description of WeSay software and process" (PDF). Retrieved 2014-02-03.
  15. ^ "Developing ASL vocabulary for science and math". Washington.edu. December 7, 2012. Retrieved 2014-02-03.
  16. ^ "Pashto Proverb Collection project". AfghanProverbs.com. Retrieved 2014-02-03.
  17. ^ "Web 2.0-based crowdsourcing for high-quality gold standard development in clinical Natural Language Processing". Jmir.org. doi:10.2196/jmir.2426. Retrieved 2014-02-03.
  18. ^ Lombard, Amy (May 5, 2013). "Crowdfynd: The First Place to Look". TIME.com. Retrieved 2014-02-03.
  19. ^ Prive, Tanya (November 27, 2012). "What Is Crowdfunding And How Does It Benefit The Economy". Forbes. Retrieved November 27, 2012.
  20. ^ Belleflame, Paul (2011), "Crowdfunding: Tapping the Right Crowd", Core Discussion Paper
  21. ^ a b "Beyond XPrize: The 10 Best Crowdsourcing Tools and Technologies". February 20, 2012. Retrieved March 30, 2012.
  22. ^ Grandoni, Dino (August 16, 2012). "Offbeatr - A Kickstarter for Porn". The Huffington Post.
  23. ^ Morelli, Keith (23 September 2012). "Kayak commute dream is a citizen-investor idea". Tampa Tribune. Retrieved 28 February 2014.
  24. ^ Rulison, Larry (February 14, 2012). "A Winning App? They Hope So". Times Union (Albany).
  25. ^ Cite error: The named reference wired2006 was invoked but never defined (see the help page).
  26. ^ Yang, J.; Adamic, L.; Ackerman, M. (2008), "Crowdsourcing and Knowledge Sharing: Strategic User Behavior on Taskcn" (PDF), Proceedings of the 9th ACM Conference on Electronic Commerce
  27. ^ Leimeister, J.M.; Huber, M.; Bretschneider, U.; Krcmar, H. (2009), "Leveraging Crowdsourcing: Activation-Supporting Components for IT-Based Ideas Competition", Journal of Management Information Systems, 26 (1): 197–224, doi:10.2753/mis0742-1222260108
  28. ^ Ebner, W.; Leimeister, J.; Krcmar, H. (2009), "Community Engineering for Innovations: The Ideas Competition as a method to nurture a Virtual Community for Innovations", R&D Management, 39 (4): 342–356, doi:10.1111/j.1467-9310.2009.00564.x
  29. ^ "DARPA Network Challenge". DARPA Network Challenge. Retrieved November 28, 2011.
  30. ^ "Social media web snares 'criminals'". New Scientist. Retrieved April 4, 2012.
  31. ^ Kittur, A.; Chi, E.H.; Sun, B. (2008), "Crowdsourcing user studies with Mechanical Turk" (PDF), CHI 2008