Ingenious Crowdsourcing or Hidden Risks? Decoding Amazon’s Mechanical Turk

Author: Suraj Sunkara, Graphics: Nina Tagliabue

The BRB Bottomline

While Amazon Mechanical Turk seems like a reliable and easy way to make a quick buck and collect data, concerns about worker compensation and the reliability of the data it collects call its effectiveness into question. Continue reading to learn more!


Throughout my life, I’ve had an obsession with random ways to make a quick buck. Whether that be scanning my parents’ grocery receipts on Shopkick or checking daily for a new Google Opinion Rewards survey, I always try to find the least time-consuming method to make some money while sitting on my bed. One website I’ve come across over the past year is Amazon Mechanical Turk, or MTurk for short. MTurk crowdsources surveys and other small tasks to respondents, allowing task requesters to gain a vast amount of information for a small price. Owned by Amazon (as the name suggests), MTurk is definitely one of the more trustworthy sites I’ve used over the years. However, reliability aside, Amazon Mechanical Turk is not without its fair share of negative sentiment. Throughout this article, we will explore how MTurk actually works and whether it deserves some of the contempt it receives.

What even is Amazon Mechanical Turk?

The name “Mechanical Turk” comes from an 18th-century chess-playing device that challengers thought was automated. Unbeknownst to them, the device was actually controlled by a human hidden inside the machine (check this out to learn more about the story). Jeff Bezos, the founder of Amazon, described MTurk as “artificial artificial intelligence.” Essentially, humans complete tasks on MTurk that machines can’t easily do, and in exchange they are rewarded with a small sum of money. The premise of MTurk falls under the bounds of today’s “gig economy.” In the same way Uber and Lyft connect people who drive with people who need to be driven around, MTurk connects firms that need data with people willing to provide it. In the next few sections, we will take a closer look at the individuals on both sides and what is being transferred.

HITs and Turkers

The tasks that individuals complete on Amazon Mechanical Turk are called HITs, a shortened form of “Human Intelligence Tasks.” HITs generally fall into a few categories, notably transcription, surveys, content classification or matching, and collecting information from pictures. Tasks are mainly composed of actions that would require a good deal of computing power for a machine to complete. In my MTurk experience, I’ve run into HITs with CAPTCHAs, voice recordings, and tediously long surveys. Having machines do these tasks would not give an output the requesters would be interested in. The workers who complete these HITs are known as “Turkers.” To become a Turker, you must be registered on Amazon. If you complete the tasks that you see and that are accessible to you, you get compensated accordingly. Sometimes, depending on the quality of your work, you can even receive a small bonus.

Who Requests HITs?

In a world fueled by data, it’s easy to understand why companies would line up to obtain vast amounts of data both efficiently and legally. The requesters of these HITs tend to fall into one of two categories: academics and businesses. On the academic side, you have researchers and scientists looking for participants and data for scientific studies. The business side is largely dominated by a single firm that produces real-time sales data. Requesters post the tasks they want completed, along with the compensation they will pay individuals who complete them. Amazon, naturally, takes a surcharge on any HIT posted. Requesters can limit who can complete their HITs based on individual factors (like geographic location) or something more general, like experience with MTurk measured through total completed HITs. Once an individual completes a HIT, the requester can deny compensation if they feel that the individual’s work was not up to standard. For example, if someone fills out a survey with troll responses, they likely will not be compensated.
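To make the requester-side economics concrete, here is a minimal sketch of how a requester’s total bill might break down. The 20% fee rate is an assumption for illustration (MTurk’s actual fee schedule varies by HIT type and should be checked against Amazon’s current pricing):

```python
# Sketch: a requester's total cost for posting a HIT, assuming a flat
# 20% Amazon surcharge on worker rewards (fee rate is an assumption,
# not Amazon's published schedule).

def total_cost(reward_per_hit: float, num_assignments: int,
               fee_rate: float = 0.20) -> float:
    """Total amount the requester pays: worker rewards plus Amazon's fee."""
    rewards = reward_per_hit * num_assignments
    fee = rewards * fee_rate
    return round(rewards + fee, 2)

# e.g. a survey paying $0.50 to 100 respondents
print(total_cost(0.50, 100))  # 60.0  ($50 in rewards + $10 surcharge)
```

The point of the sketch is simply that the surcharge scales with the reward, so requesters have a direct financial incentive to keep per-HIT compensation low.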

Issues with MTurk

Bias in Data Collection

On the requester side, there are significant concerns about the validity of the data collected through MTurk due to bias in the data collection process. Sampling bias, for example, is a serious issue. While MTurk workers are more diverse than the university student samples that many researchers usually end up with, they still aren’t representative of Americans or working adults as a whole. Because responses aren’t indicative of what the general public might say, using them in a scientific study or business decision might have negative repercussions.

Another issue is non-naivete: about 5.7% of workers complete about 40% of all research HITs posted on MTurk. These people are labeled “superworkers,” and they either spend a significant portion of their day on MTurk or use tools to actively grab decently compensating HITs the moment they appear. These individuals may respond differently from their more novice peers when completing various HITs. For example, if you’ve completed hundreds of surveys, you might just end up answering new surveys as quickly as possible, with as little effort as possible, while still getting compensated.

Other forms of bias are also present in MTurk’s data collection process. Coming up with methods to alleviate the effects of this bias is of utmost importance for requesters.

Unfair Compensation for Turkers

From an outside perspective, Amazon Mechanical Turk seems like a glorious way to make a few dollars for very little work. However, MTurk is not as glamorous as it seems; in fact, it’s pretty difficult to really get your money’s worth. Most HITs listed on the website are locked for novice Turkers like myself. You either need approval from the requester or must have completed a certain number of HITs to qualify. Most of the HITs I can currently access give a total compensation of $0.01. No, that’s not a typo – that’s one singular penny. For that compensation to be on par with California’s minimum wage of $15.00 per hour, the HIT would need to take less than 2.4 seconds. This invites comparisons to a digital sweatshop, especially as a non-insignificant portion of Turkers are low-income workers looking to supplement their incomes. An academic study from 2018 showed that the median hourly wage for a Turker was roughly $2 per hour.
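The 2.4-second figure above is simple arithmetic, sketched here as a back-of-the-envelope check (the reward and wage values are just the ones quoted in the paragraph):

```python
# How long can a task take before its reward falls below a target hourly wage?
# time (hours) = reward / wage, converted to seconds.

def max_seconds_for_wage(reward: float, hourly_wage: float) -> float:
    """Seconds a task may take for its pay to match the target hourly wage."""
    return reward / hourly_wage * 3600

# A $0.01 HIT measured against a $15.00/hour minimum wage:
print(max_seconds_for_wage(0.01, 15.00))  # 2.4
```

Run the same function in reverse on the study’s ~$2/hour median and a penny HIT would still have to be finished in about 18 seconds, which gives a sense of how fast superworkers must churn through tasks.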

Take-Home Points

If you are incredibly bored and have some free time on your hands, MTurk could be an interesting way to make a quick buck. While you won’t be compensated especially fairly, it is a way to make money while sitting on your bed. The tasks you have to do are all pretty menial: surveys, transcription, and so on. Next time you find yourself with an hour to spare, consider checking MTurk out!
