Amazon Mechanical Turk

  1. Overview

      Location of Turkers  

  2. History

  3. Description

      Users
      Other aspects

  4. Uses

      Applications
      Missing persons searches
      Social science experiments
      Artistic and educational research
      Third-party programming
      API
      Use case examples
      Processing photos / videos
      Data cleaning / verification
      Information collection
      Data processing

  5. Research validity

  6. Labor issues

      Monetary compensation
      Fraud
      Labor relations

  7. Related systems

  8. See also

  9. References

  10. Further reading

  11. External links

{{Update|inaccurate=yes|date=July 2015}}

{{Infobox website
| logo =
| alexa = 5,330 ({{as of|2016|09|05|alt=September 2016}})[1]
| url = {{URL|www.mturk.com}}
| current status = Live
}}

Amazon Mechanical Turk (MTurk) is a crowdsourcing Internet marketplace that enables individuals and businesses (known as Requesters) to coordinate the use of human intelligence to perform tasks that computers are currently unable to do. It is one of the sites of Amazon Web Services, and is owned by Amazon.[2] Employers can post jobs known as Human Intelligence Tasks (HITs), such as choosing the best among several photographs of a storefront, writing product descriptions, or identifying performers on music CDs. Workers, colloquially known as Turkers, can then browse existing jobs and complete them in exchange for a monetary payment set by the employer. To place jobs, requesting programs use an open application programming interface (API) or the more limited MTurk Requester site.[3] To submit a request for tasks to be completed through the Amazon Mechanical Turk web site, a requester must provide a billing address in one of around 30 approved countries.[4]

Overview

The name Mechanical Turk was inspired by "The Turk", an 18th-century chess-playing automaton made by Wolfgang von Kempelen that toured Europe, beating both Napoleon Bonaparte and Benjamin Franklin. It was later revealed that this "machine" was not an automaton at all, but was, in fact, a human chess master hidden in the cabinet beneath the board and controlling the movements of a humanoid dummy. Likewise, the Mechanical Turk online service allows humans to help the machines of today perform tasks for which they are not suited.

Workers set their own hours and are under no obligation to accept any particular task. Because workers are paid as contractors rather than employees, requesters do not have to file employment forms or pay payroll taxes, and they avoid laws governing minimum wage, overtime, and workers' compensation. Workers, however, must report their income as self-employment income. The average wage for the multiple microtasks assigned, if performed quickly, is about one dollar an hour, with each task averaging a few cents.[5]

Requesters can ask that Workers fulfill qualifications before engaging in a task, and they can set up a test to verify each qualification. They can also accept or reject the result sent by a Worker, which affects the Worker's reputation. Workers can have a postal address anywhere in the world. Payment for completed tasks can be redeemed on Amazon.com via gift certificate (the only payment option available to international workers, apart from those in India) or transferred to a Worker's U.S. bank account. Requesters pay Amazon a 20% commission on the price of successfully completed jobs; for example, a HIT that pays a Worker $1.00 costs the Requester $1.20.[6]

Location of Turkers

According to a survey conducted in 2008 through one MTurk HIT, Workers were primarily located in the United States,[7] with demographics generally similar to those of the overall U.S. Internet population.[8]

The same author carried out a second survey in 2010 (after the introduction of cash payments for Indian workers), which provided updated results on the demographics of workers.[9] He also runs a website, updated hourly, showing worker demographics; it indicates that approximately 80% of workers are located in the United States, with most of the remaining 20% in India.[10]

A more recent study reports Worker demographics on over 30,000 Workers across 75 studies that have been conducted since 2013.[11]

History

The service was initially conceived by Venky Harinarayan.[12]

MTurk was launched publicly on November 2, 2005. Following its launch, the Mechanical Turk user base grew quickly. In early- to mid-November 2005, there were tens of thousands of jobs, all of them uploaded to the system by Amazon itself for some of its internal tasks that required human intelligence. HIT types have expanded to include transcribing, rating, image tagging, surveys, and writing.

In March 2007, there were reportedly more than 100,000 workers in over 100 countries.[6] This increased to over 500,000 registered workers from over 190 countries in January 2011.[13] In the same year, Techlist published an interactive map pinpointing the locations of 50,000 of their MTurk workers around the world.[14]

Description

Users

A user of Mechanical Turk can be either a "Worker" (contractor) or a "Requester" (employer).

Workers have access to a dashboard that displays three sections: total earnings, HIT status, and HIT totals.

  • Total earnings: displays the worker's total earnings from completed Human Intelligence Tasks, earnings from bonuses, and the sum of the two.
  • HIT status: displays daily activity and daily income, along with the number of HITs submitted, approved, rejected, or pending for each day.
  • HIT totals: displays information about HITs that have been accepted or are in progress, including the percentage of HITs submitted, returned, or abandoned, and the percentage of submitted HITs that were approved, rejected, or are still pending.

Employers (companies or independent developers who need jobs performed) can use the Amazon Mechanical Turk API to programmatically integrate the results of that work directly into their business processes and systems. When employers set up a job, they must specify

  • how much they will pay for each completed HIT,
  • how many workers they want to work on each HIT,
  • the maximum time a worker may spend on a single task,
  • the deadline by which all the work must be completed,

as well as the specific details of the job they want completed; a sketch of such a job submission appears below.
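
As an illustration, the following minimal sketch uses the boto3 AWS SDK for Python to create a HIT with these parameters against the MTurk sandbox. The title, reward, timings, and question XML are placeholder values chosen for illustration, not details taken from this article.

    import boto3

    # Connect to the MTurk sandbox, where no real money changes hands.
    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # A trivial question form; real HITs often use HTMLQuestion or ExternalQuestion instead.
    question_xml = """<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
      <Question>
        <QuestionIdentifier>storefront</QuestionIdentifier>
        <QuestionContent><Text>Which photo best shows the storefront?</Text></QuestionContent>
        <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
      </Question>
    </QuestionForm>"""

    response = mturk.create_hit(
        Title="Choose the best storefront photo",   # what workers see when browsing HITs
        Description="Pick the clearest photo of the storefront.",
        Reward="0.05",                               # payment per completed HIT, in USD
        MaxAssignments=3,                            # how many workers may work on this HIT
        AssignmentDurationInSeconds=600,             # maximum time a worker may spend on one task
        LifetimeInSeconds=86400,                     # deadline: how long the HIT stays available
        Question=question_xml,
    )
    print("HIT created:", response["HIT"]["HITId"])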

Other aspects

Crowd labor

Amazon Mechanical Turk provides access to a crowdsourced market of workers who can complete work on an as-needed basis. For work that does not require significant task-specific training, this can cost less than hiring and managing temporary staff in the traditional way. Workers, in turn, can select among a variety of different tasks.

Quality management

Amazon Mechanical Turk allows more than one worker to submit a response to the same HIT; when a specified number of workers give the same answer, the HIT is approved automatically, as sketched below. All data for HITs is available for viewing as soon as it is submitted, allowing Requesters to manually assess quality. Requesters are not required to accept a worker's results: if a result is deemed inadequate, the job is rejected and the Requester does not pay, which may lead to friction between the parties.
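
The agreement rule above can be illustrated with a small, hedged sketch (not Amazon's actual implementation): collect the answers submitted for one HIT and approve once enough of them match.

    from collections import Counter
    from typing import Optional

    def consensus(answers: list[str], required_agreement: int) -> Optional[str]:
        """Return the majority answer once at least `required_agreement` workers agree."""
        if not answers:
            return None
        answer, count = Counter(answers).most_common(1)[0]
        return answer if count >= required_agreement else None

    # Three workers answered the same HIT; two agree, so the HIT is auto-approved.
    print(consensus(["cat", "cat", "dog"], required_agreement=2))  # -> cat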

Price determination

Workers are free to choose the tasks they find most interesting, most enjoyable, or best paid. Requesters may set payments to reflect their desired balance of quality and cost-efficiency. Payments are processed through Amazon Payments.

User qualification

Amazon Mechanical Turk allows Requesters to qualify workers with short tests before the workers take on their tasks. A qualification can consist of a series of questions, a sample task, or a requirement that a worker's historical approval rate for submitted HITs exceed a minimum percentage; an API sketch follows below. Because such preliminary tests are unpaid, companies can use them to collect answers to questions without paying anything.
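
For illustration, the hedged sketch below uses boto3 to create a qualification type with a test and to build the requirements that could then be attached to a HIT; the test content, names, and thresholds are invented placeholders.

    import boto3

    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # A one-question screening test (QuestionForm XML); placeholder content.
    test_xml = """<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
      <Question>
        <QuestionIdentifier>q1</QuestionIdentifier>
        <QuestionContent><Text>Type the word shown: HELLO</Text></QuestionContent>
        <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
      </Question>
    </QuestionForm>"""

    qual = mturk.create_qualification_type(
        Name="Sample screening test",
        Description="A short unpaid test workers must pass before taking our HITs.",
        QualificationTypeStatus="Active",
        Test=test_xml,
        TestDurationInSeconds=300,   # time allowed for the test
    )

    # Qualification requirements to pass to create_hit(): pass the test above
    # and have a historical approval rate of at least 95%.
    requirements = [
        {
            "QualificationTypeId": qual["QualificationType"]["QualificationTypeId"],
            "Comparator": "Exists",
        },
        {
            "QualificationTypeId": "000000000000000000L0",  # built-in approval-rate qualification
            "Comparator": "GreaterThanOrEqualTo",
            "IntegerValues": [95],
        },
    ]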

Data for Machine Learning

Supervised machine learning algorithms require large amounts of human-annotated data to be trained successfully. Researchers traditionally obtained such annotations by hiring students, with high turnover resulting from the extreme repetitiveness and tedium of the task. By dividing annotation work across a large number of workers, MTurk offers a means to acquire labeled data cost-effectively while minimizing tedium, since each individual worker's batch can be kept suitably small; see the sketch below.
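
As a minimal sketch of that division of labor (assuming a hypothetical list of items to label, not data from this article), a large annotation job can be chunked into many small HIT-sized batches:

    def batches(items, size):
        """Split a large annotation job into small per-HIT batches."""
        for i in range(0, len(items), size):
            yield items[i : i + size]

    sentences = [f"sentence {n}" for n in range(1000)]   # hypothetical items to label
    hits = list(batches(sentences, size=10))             # 100 small HITs, 10 items each
    print(len(hits))  # -> 100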

Uses

Applications

Missing persons searches

Since 2007, the service has been used to search for prominent missing individuals. It was first suggested during the search for James Kim, but his body was found before any technical progress was made. That summer, computer scientist Jim Gray disappeared on his yacht and Amazon's Werner Vogels, a personal friend, made arrangements for DigitalGlobe, which provides satellite data for Google Maps and Google Earth, to put recent photography of the Farallon Islands on Mechanical Turk. A front-page story on Digg attracted 12,000 searchers who worked with imaging professionals on the same data. The search was unsuccessful.[15]

In September 2007, a similar arrangement was repeated in the search for aviator Steve Fossett. Satellite data was divided into 85-square-meter sections, and Mechanical Turk users were asked to flag images with "foreign objects" that might be a crash site or other evidence warranting closer examination.[16] This search was also unsuccessful: the satellite imagery mostly covered a 50-mile radius,[17] but the crash site was eventually found by hikers about a year later, 65 miles away.[18]

Social science experiments

Beginning in 2010, numerous researchers have explored the viability of Mechanical Turk for recruiting subjects for social science experiments. Thousands of papers relying on data collected from Mechanical Turk workers are published each year, including hundreds in top-ranked academic journals.[19] Researchers have generally found that while samples of respondents obtained through Mechanical Turk do not perfectly match all relevant characteristics of the U.S. population, they are not wildly unrepresentative either.[20][21] The general consensus among researchers is that the service works best for recruiting a diverse sample; it is less suited to studies that require precisely defined populations or a representative sample of the population as a whole.[19][22][23][24][25][26]

Overall, the U.S. MTurk population is mostly female and white, and is somewhat younger and more educated than the U.S. population as a whole. However, data collected on jobs conducted since 2013 show that the U.S. MTurk population is no longer predominantly female: Workers are now slightly more likely to be male.[11] The cost of MTurk is considerably lower than that of other survey methods, with workers willing to complete tasks for less than half the U.S. minimum wage.[27]

Artistic and educational research

In addition to receiving growing interest from the social sciences, MTurk has been used as a tool for artistic and educational exploration. Artist Aaron Koblin has used MTurk's crowdsourcing capabilities to create a number of collaborative artistic works such as The Sheep Market and Ten Thousand Cents,[28] the latter combining thousands of individual drawings of a US$100 bill.[29] The work functions as a sort of reverse exquisite corpse drawing.

Inspired by Koblin's collaborative artworks, Scott McMaster, a graduate research student at Concordia University, turned to MTurk to see whether the crowdsourcing technology could also be used for educational research. McMaster conducted two pilot projects that used HITs to request drawings; in contrast to Koblin's work, the workers knew exactly what the drawings were being used for. The jobs required participants to visually represent sets of words in drawings and to fill out a short demographic survey. Although the research was in its infancy, McMaster's findings suggested that a globalizing effect is emerging within visual cultural representations; the work is a published instance of this type of online research into visual culture.[30]

Third-party programming

Programmers have developed various browser extensions and scripts designed to simplify the process of completing jobs. According to the Amazon Web Services Blog, however, Amazon disapproves of those that completely automate the process and remove the human element, out of concern that the task completion process (e.g., answering a survey) could be gamed with random responses, rendering the collected data worthless.[31] Accounts using such automated bots have been banned. There are also third-party services that extend the capabilities of MTurk.

API

Amazon makes available an application programming interface (API) that gives users another access point into the MTurk system. The MTurk API lets a programmer access numerous aspects of MTurk, such as submitting jobs, retrieving completed work, and approving or rejecting that work; a sketch of this retrieve-and-approve cycle follows below.[32] In 2017, Amazon launched support for the AWS Software Development Kits (SDKs), making nine new SDKs available to MTurk users; MTurk is accessible via API from Python, JavaScript, Java, .NET, Go, Ruby, PHP, and C++.[33] Web sites and web services can use the API to integrate MTurk work into other web applications, providing users with alternatives to the interface Amazon has built for these functions.
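
For illustration, the sketch below (again assuming the boto3 Python SDK, with a placeholder HIT ID and a made-up acceptance check) retrieves submitted work for a HIT and approves or rejects it, mirroring the operations named above.

    import boto3

    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    hit_id = "EXAMPLE_HIT_ID"  # placeholder; in practice, returned by create_hit

    # Retrieve completed work that workers have submitted for this HIT.
    result = mturk.list_assignments_for_hit(HITId=hit_id, AssignmentStatuses=["Submitted"])

    for assignment in result["Assignments"]:
        answer_xml = assignment["Answer"]  # the worker's answers, as QuestionFormAnswers XML
        if "storefront" in answer_xml:     # placeholder acceptance check for illustration
            mturk.approve_assignment(AssignmentId=assignment["AssignmentId"])
        else:
            mturk.reject_assignment(
                AssignmentId=assignment["AssignmentId"],
                RequesterFeedback="Answer did not address the question.",
            )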

Use case examples

Processing photos / videos

Amazon Mechanical Turk provides a platform for processing images, a task well suited to human intelligence. Requesters have created tasks asking workers to label objects in an image, select the most relevant picture from a group of pictures, screen inappropriate content, and classify objects in satellite images. Crowd workers have also digitized text from images such as scanned forms filled out by hand.[34]

Data cleaning / verification

Companies with large online catalogs use Mechanical Turk to identify duplicates and verify the details of item entries. Examples of fixing duplicates include identifying and removing duplicate entries in yellow pages directory listings and online product catalogs. Examples of verifying details include checking restaurant information (e.g., phone number and hours) and finding contact information on web pages (e.g., author name and email).[5][34]

Information collection

The diversity and scale of the Mechanical Turk workforce allow information to be collected at a scale that would be difficult outside a crowd platform. Mechanical Turk lets requesters amass large numbers of responses to various types of surveys, from basic demographics to academic research. Other uses include writing comments, descriptions, and blog entries for websites, and searching for data elements or specific fields in large government and legal documents.[34]

Data processing

Companies use Mechanical Turk's crowd labor to understand and respond to different types of data. Common uses include editing and transcription of podcasts, translation, and matching search engine results.[5][34]

Research validity

The validity of research conducted with the Mechanical Turk worker pool has been questioned,[35][36] in large part because of the proprietary method Mechanical Turk uses to select its workers.[37] Since the method of selection is not shared with researchers, they cannot know the true demographics of the participant pool. It is unclear whether Mechanical Turk applies fiscal, political, or educational filters in its selection process; this may invalidate surveys or research conducted using the Mechanical Turk worker pool.[38][39]

Labor issues

Monetary compensation

Because tasks are typically simple and repetitive, and workers are often paid only a few cents to complete them, some have criticized Mechanical Turk for exploiting workers and not compensating them for the true value of the tasks they complete.[40] The minimum payment that Amazon allows for a task is one cent. The market for tasks is competitive, and for some workers, particularly the less educated, these tasks are their only available form of employment. Because of the need to provide for themselves and a lack of other opportunities, many workers accept low compensation for completing tasks. A study of 3.8 million tasks completed by 2,767 workers on Amazon's Mechanical Turk showed that "workers earned a median hourly wage of about $2 an hour," with 4 percent of workers earning more than $7.25 per hour. Because these workers are considered independent contractors, they are not protected by the Fair Labor Standards Act, which guarantees minimum wage. Workers receive no compensation while searching for tasks, and no additional compensation if a task takes longer than the requester estimated.[41] Computer scientist Jaron Lanier notes how the design of Mechanical Turk "allows you to think of the people as software components," which conjures "a sense of magic, as if you can just pluck results out of the cloud at an incredibly low cost."[42] On the other hand, one psychological study from the University of Texas found evidence that many workers did not complete tasks for monetary compensation, but rather for enjoyment and self-fulfillment.[24]

Fraud

The Nation magazine said in 2014 that some requesters had taken advantage of workers by having them do the tasks, then rejecting their submissions in order to avoid paying them.[43]

In the Facebook–Cambridge Analytica data scandal, Mechanical Turk was one of the means of covertly gathering private information for a massive database.[44] The system paid workers a dollar or two to install a Facebook-connected app and answer personal questions. Although the survey task appeared to be a demographic or psychological research project, its purpose was instead to bait workers into revealing personal information about their identities that had not already been collected by Facebook or Mechanical Turk.

Labor relations

Others have criticized the marketplace for giving workers no ability to negotiate with employers. In response to growing criticism of payment evasion and lack of representation, a group developed a third-party platform called Turkopticon, which allows workers to give feedback on their employers, helping other users avoid potentially shady jobs and find superior employers.[45][46] Another platform, Dynamo, was created to allow workers to gather anonymously and organize campaigns to improve their work environment, including the Guidelines for Academic Requesters and the Dear Jeff Bezos Campaign.[47][48][49][50] Amazon has made it harder for workers to enroll in Dynamo by closing the requester account that provided workers with a code required for Dynamo membership. Amazon has also installed updates that prevent plugins that identify high-quality human intelligence tasks from functioning on the website.[41]

Related systems

{{further|Crowdsourcing}}

Amazon coined the term artificial artificial intelligence for processes that outsource parts of a computer program to humans, for tasks that humans carry out much faster than computers. Jeff Bezos was responsible for the concept that led to the development of Amazon's Mechanical Turk as a realization of this process.[51]

MTurk is comparable in some respects to the now discontinued Google Answers service. However, the Mechanical Turk is a more general marketplace that can potentially help distribute any kind of work tasks all over the world. The Collaborative Human Interpreter (CHI) by Philipp Lenssen also suggested using distributed human intelligence to help computer programs perform tasks that computers cannot do well. MTurk could be used as the execution engine for the CHI.

The Russian search giant Yandex has developed a similar system called Yandex.Toloka.[52]

See also

  • CAPTCHA, which challenges and verifies human work at a simple online task
  • Citizen science
  • Microwork

References

1. ^{{cite web|url= http://www.alexa.com/siteinfo/mturk.com |title= mturk.com Site Info | publisher= Alexa Internet |accessdate= 2016-09-07 }}
2. ^{{cite web|url=https://www.mturk.com/mturk/help?helpPage=overview|accessdate=14 April 2017|title=Amazon Mechanical Turk, FAQ page}}
3. ^{{cite web|url=http://requester.mturk.com |title=Overview | Requester | Amazon Mechanical Turk |publisher=Requester.mturk.com |date= |accessdate=2011-11-28}}
4. ^{{cite web|url=https://requester.mturk.com/help/faq#can_requesters_outside_us_use_mturk|title=FAQs – Help – Requester – Amazon Mechanical Turk}}
5. ^"Amazon Mechanical Turk: The Digital Sweatshop" Ellen Cushing Utne Reader January–February 2013
6. ^{{Cite web|url = http://aws.amazon.com/mturk/pricing/|title = Mturk pricing|date = 2014|accessdate = 21 July 2014|website = AWS|publisher = Amazon}}
7. ^{{cite news|url= http://behind-the-enemy-lines.blogspot.com/2008/03/mechanical-turk-demographics.html |title=Mechanical Turk: The Demographics|author=Panos Ipeirotis|date=March 19, 2008|publisher=New York University|accessdate=2009-07-30}}
8. ^{{cite news|url=http://behind-the-enemy-lines.blogspot.com/2009/03/turker-demographics-vs-internet.html|title=Turker Demographics vs Internet Demographics|author=Panos Ipeirotis|date=March 16, 2009|publisher=New York University|accessdate=2009-07-30}}
9. ^{{cite news|url= http://www.behind-the-enemy-lines.com/2010/03/new-demographics-of-mechanical-turk.html|title=The New Demographics of Mechanical Turk|author=Panos Ipeirotis|date=March 9, 2010|publisher=New York University|accessdate=2014-03-24}}
10. ^{{cite web|title=MTurk Tracker|url=http://demographics.mturk-tracker.com/|website=demographics.mturk-tracker.com|accessdate=1 October 2015}}
11. ^{{cite web|title=The New New Demographics on Mechanical Turk: Is there Still a Gender Gap?|url=http://blog.turkprime.com/2015/03/the-new-new-demographics-on-mechanical.html|website=TurkPrime.com|accessdate=12 March 2015}}
12. ^{{Cite web|url = http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.htm&r=1&f=G&l=50&s1=7,197,459.PN.&OS=PN/7,197,459&RS=PN/7,197,459/|title = Hybrid machine/human computing arrangement|date = 2001|accessdate = 28 July 2016}}
13. ^{{cite web|title=AWS Developer Forums|url=https://forums.aws.amazon.com/thread.jspa?threadID=58891|accessdate=14 November 2012}}
14. ^{{cite web|last=Tamir|first=Dahn|title=50000 Worldwide Mechanical Turk Workers|url=http://techlist.com/mturk/global-mturk-worker-map.php|publisher=techlist|accessdate=September 17, 2014}}
15. ^{{cite news|url=https://www.wired.com/techbiz/people/magazine/15-08/ff_jimgray?currentPage=5|title=Inside the High-Tech Search for a Silicon Valley Legend|author=Steve Silberman|date=July 24, 2007|publisher=Wired magazine|accessdate=2007-09-16}}
16. ^{{cite web|url=http://www.avweb.com/avwebflash/news/SteveFossettSearch_AmazonMechanicalTurk_PleaseHelp_196097-1.html |title=AVweb Invites You to Join the Search for Steve Fossett |publisher=Avweb.com |date= |accessdate=2011-11-28}}
17. ^{{cite web|url=http://s3.amazonaws.com/fossett/index.html|title=Official Mechanical Turk Steve Fossett Results|date=2007-09-24|accessdate=2012-08-14}}
18. ^{{cite news|url=https://www.reuters.com/article/peopleNews/idUSTRE4907G820081001|title=Hikers find Steve Fossett's ID, belongings|author=Jim Christie|date=October 1, 2008|publisher=Reuters|accessdate=2008-11-27| archiveurl= https://web.archive.org/web/20081220030716/https://www.reuters.com/article/peopleNews/idUSTRE4907G820081001| archivedate= 20 December 2008| deadurl= no}}
19. ^{{cite journal | last1 = Chandler | first1 = Jesse. | last2 = Shapiro | first2 = Danielle | year = 2016 | title = Conducting Clinical Research Using Crowdsourced Convenience Samples | journal = Annual Review of Clinical Psychology | volume = 12 | pages = 53–81 | url = https://www.mathematica-mpr.com/our-publications-and-findings/publications/conducting-clinical-research-using-crowdsourced-convenience-samples | doi=10.1146/annurev-clinpsy-021815-093623| pmid = 26772208 }}
20. ^{{cite journal | last1 = Casey | first1 = Logan | last2 = Chandler | first2 = Jesse | last3 = Levine | first3 = Adam | last4 = Proctor | first4 = Andrew| last5 = Sytolovich| first5 = Dara| year = 2017 | title = Intertemporal Differences Among MTurk Workers: Time-Based Sample Variations and Implications for Online Data Collection | journal = SAGE Open | volume = 7 | issue = 2 | pages = 215824401771277 | doi = 10.1177/2158244017712774 }}
21. ^{{cite journal | last1 = Levay | first1 = Kevin | last2 = Freese | first2 = Jeremy | last3 = Druckman | first3 = James| year = 2016 | title = The Demographic and Political Composition of Mechanical Turk Samples | journal = SAGE Open | volume = 6| pages = 215824401663643 | doi = 10.1177/2158244016636433 }}
22. ^{{cite news | url=http://journalistsresource.org/studies/economics/commerce/online-labor-markets-research-amazon-mechanical-turk/ | title=Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk}} JournalistsResource.org, retrieved June 18, 2012
23. ^{{cite journal | last1 = Paolacci | first1 = Gabriele. | last2 = Chandler | first2 = Jesse | last3 = Ipeirotis | first3 = Panos | year = 2010 | title = Running Experiments on Amazon Mechanical Turk | journal = Judgment and Decision Making | url = http://www.sjdm.org/~baron/journal/10/10630a/jdm10630a.pdf}}
24. ^{{cite journal | last1 = Buhrmester | first1 = Michael | last2 = Kwang | first2 = Tracy | last3 = Gosling | first3 = Sam | year = 2011 | title = Amazon's Mechanical Turk A New Source of Inexpensive, Yet High-Quality, Data? | journal = Perspectives on Psychological Science | volume = 6 | issue = 1 | pages = 3–5 | url = http://pps.sagepub.com/content/6/1/3.short | doi = 10.1177/1745691610393980| pmid = 26162106 }}
25. ^{{cite journal | last1 = Berinsky | first1 = Adam J. | last2 = Huber | first2 = Gregory A. | last3 = Lenz | first3 = Gabriel S. | year = 2012 | title = Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk | journal = Political Analysis | volume = 20 | issue = 3 | pages = 351 | url = http://pan.oxfordjournals.org/content/early/2012/03/02/pan.mpr057.abstract | doi = 10.1093/pan/mpr057}}
26. ^{{cite journal | last1 = Horton | first1 = John J. | last2 = Rand | first2 = David G. | last3 = Zeckhauser | first3 = Richard J. | year = 2010 | title = The Online Laboratory: Conducting Experiments in a Real Labor Market | journal = Experimental Economics | volume = 14 | issue = 3 | pages = 399 | doi = 10.1007/s10683-011-9273-9| arxiv = 1004.2931 | citeseerx = 10.1.1.763.3358 }}
27. ^{{cite book | last1 = Horton | first1 = John | last2 = Chilton | first2 = Lydia | year = 2010 | title = The Labor Economics of Paid Crowdsourcing | journal = Proceedings of the 11th ACM Conference on Electronic Commerce | pages = 209 | arxiv = 1001.0627| doi = 10.1145/1807342.1807376| isbn = 978-1-60558-822-3 }}
28. ^Ten Thousand Cents – Project: http://www.tenthousandcents.com/top.html
29. ^Koblin, Aaron: {{cite web |url=http://www.aaronkoblin.com/work.html |title=Archived copy |accessdate=2013-03-20 |deadurl=yes |archiveurl=https://web.archive.org/web/20130116144056/http://www.aaronkoblin.com/work.html |archivedate=2013-01-16 |df= }}
30. ^McMaster, S. (2012). New Approaches to Image-based Research and Visual Literacy. In Avgerinou, Chandler, Search and Terzic (Eds.), New Horizons in Visual Literacy: Selected Readings of the International Visual Literacy Association (122-132). Siauliai, Lithuania: SMC Scientia Educologica: https://concordia.academia.edu/SCOTTMCMASTER
31. ^{{cite web|url=http://aws.typepad.com/aws/2005/12/amazon_mechanic.html |title=Amazon Web Services Blog: Amazon Mechanical Turk Status Update |publisher=Aws.typepad.com |date=2005-12-06 |accessdate=2011-11-28}}
32. ^{{cite web |url=http://developer.amazonwebservices.com/connect/kbcategory.jspa?categoryID=28 |title=Documentation Archive : Amazon Web Services |publisher=Developer.amazonwebservices.com |date= |accessdate=2011-11-28 |deadurl=yes |archiveurl=https://web.archive.org/web/20090410032147/http://developer.amazonwebservices.com/connect/kbcategory.jspa?categoryID=28 |archivedate=2009-04-10 |df= }}
33. ^{{cite web|url=http://docs.aws.amazon.com/AWSMechTurk/latest/AWSMturkAPI/Welcome.html | title=Amazon Mechanical Turk API Reference |publisher=Developer.amazonwebservices.com }}
34. ^"Inside Amazon's clickworker platform: How half a million people are being paid pennies to train AI" Tech Republic
35. ^{{cite web|title=Running Experiments with Amazon Mechanical Turk|url=http://mgto.org/running-experiments-with-amazon-mechanical-turk/}}
36. ^{{cite journal|title=Can I Use Mechanical Turk (MTurk) for a Research Study?|journal=Industrial and Organizational Psychology|volume=8|issue=2|url=http://neoacademic.com/2014/11/13/can-use-mechanical-turk-mturk-research-study/|year=2015|last1=Landers|first1=R. N.|last2=Behrend|first2=T. S.}}
37. ^{{cite web|title=www.trustpilot.com|url=https://www.trustpilot.com/reviews/58c5b860a912c407fce890bb}}
38. ^{{cite web|title=External Validity|url=https://explorable.com/external-validity}}
39. ^{{cite web|title=External Validity|url=https://www.socialresearchmethods.net/kb/external.php}}
40. ^{{cite book |doi=10.1109/CGC.2013.89|title=The Good, The Bad and the Ugly: Why Crowdsourcing Needs Ethics|pages=531–535|year=2013|last1=Schmidt|first1=Florian Alexander|isbn=978-0-7695-5114-2|journal= Cloud and Green Computing (CGC), 2013 Third International Conference on}}
41. ^{{Cite web|url=https://www.theatlantic.com/business/archive/2018/01/amazon-mechanical-turk/551192/|title=The Internet is Enabling a New Kind of Poorly Paid Hell|last=Semuels|first=Alana|date=January 23, 2018|website=The Atlantic|access-date=May 14, 2018}}
42. ^{{cite book|author=Jaron Lanier|title=Who Owns the Future? |year=2013|publisher=Simon and Schuster|isbn=978-1-4516-5497-4|title-link=Who Owns the Future? }}
43. ^Moshe Z. Marvit, "How Crowdworkers Became the Ghosts in the Digital Machine," The Nation, February 24, 2014, screen 4
44. ^{{cite news|url=https://www.nytimes.com/2018/04/10/magazine/cambridge-analytica-and-the-coming-data-bust.html?|author=New York Times|title=Cambridge Analytica and the Coming Data Bust|date=April 10, 2018|accessdate=April 13, 2018|newspaper=The New York Times}}
45. ^{{cite magazine |url= https://www.newscientist.com/article/mg21729036.200-crowdsourcing-grows-up-as-online-workers-unite.html#.VV9rTE9Viko |title='Crowdsourcing grows up as online workers unite' |author=Hal Hodson |date=February 7, 2013 |publisher=New Scientist |accessdate=May 21, 2015}}
46. ^{{cite web|url=https://turkopticon.ucsd.edu/|title=turkopticon's add-on}}
47. ^{{cite magazine |url= https://www.theguardian.com/technology/2014/dec/03/amazon-mechanical-turk-workers-protest-jeff-bezos |title='Amazon's Mechanical Turk workers protest: 'I am a human being, not an algorithm |author=Mark Harris |date=December 3, 2014 |publisher=The Guardian |accessdate=October 6, 2015}}
48. ^{{cite magazine |url= https://www.engadget.com/2014/12/03/amazon-mechanical-turk-workers-ask-for-respect/ |title='Amazon's Mechanical Turk workers want to be treated like humans' |author=Jon Fingas |date=December 3, 2014 |publisher=Engadget |accessdate=October 6, 2015}}
49. ^{{cite magazine |url= https://www.theverge.com/2014/12/4/7331777/amazon-mechanical-turk-workforce-digital-labor |title=Amazon's Mechanical Turkers want to be recognized as 'actual human beings' |author=James Vincent |date=December 4, 2014 |publisher=The Verge |accessdate=October 6, 2015}}
50. ^{{cite magazine |url= http://www.fastcompany.com/3042081/what-does-a-union-look-like-in-the-gig-economy |title='WHAT DOES A UNION LOOK LIKE IN THE GIG ECONOMY?' |author=Sarah Kessler |date=February 19, 2015 |publisher=Fast Company |accessdate=October 6, 2015}}
51. ^{{cite news|url=http://www.economist.com/node/7001738?story_id=7001738|title=Artificial artificial intelligence|publisher=The Economist | date=2006-06-10}}
52. ^https://toloka.yandex.com/

Further reading

  • Business Week article on Mechanical Turk by Rob Hof, November 4, 2005.
  • [https://www.wired.com/wired/archive/14.06/crowds.html Wired Magazine] story about "Crowdsourcing," June 2006.
  • Salon.com article on Mechanical Turk by Katharine Mieszkowski, July 24, 2006.
  • [https://www.nytimes.com/2007/03/25/business/yourmoney/25Stream.html?ex=1332475200&en=cd1ce5d0bee647d5&ei=5088&partner=rssnyt&emc=rss New York Times article on Mechanical Turk] by Jason Pontin, March 25, 2007.
  • Technology Review article on Mechanical Turk, "How Mechanical Turk is Broken," by Christopher Mims, January 3, 2010.
  • {{citation |work=The Atlantic |date=June 8, 2015 |url= https://www.theatlantic.com/technology/archive/2015/06/the-tragedy-of-the-digital-commons/395129/ |author=J. Nathan Matias |title=Tragedy of the Digital Commons }} (discusses labor relations)

External links

  • {{Official website|www.mturk.com}}
  • Requester Best Practices Guide, Updated February 2015.
  • {{cite web |url=http://ir.ischool.utexas.edu/crowd/#mturk |title=Amazon Mechanical Turk |work=Crowdsourcing News, Events, and Resources |editor=Matt Lease |via= University of Texas at Austin School of Information |location=USA}}
{{Cloud computing}}{{Amazon}}

Categories: Amazon (company) | Crowdsourcing | Human-based computation | Social information processing | Web services
