Delivery giant DoorDash is expanding its business model beyond food and groceries. The company has launched “Tasks,” a new app that pays its drivers – rebranded as “Taskers” – to perform simple, everyday activities that generate training data for artificial intelligence (AI) models. The move, which follows a similar initiative by Uber, raises questions about the future of gig work, data privacy, and the potential displacement of workers by the very AI they’re helping to build.
The Nature of the Work
The Tasks app assigns small, often mundane activities that provide visual and conversational data for AI training. These include filming daily routines (like washing dishes or making a bed), recording oneself speaking in another language, and even scanning store shelves. DoorDash claims this data helps AI systems “understand the physical world,” a critical step in developing more sophisticated machine learning capabilities.
Payments vary with the perceived effort and complexity of the task: $16 for scanning shelves, $20 for a staged “spontaneous” Spanish conversation, and more for tasks involving cooking. This variability in pay rates adds another layer of uncertainty for gig workers, who already face unpredictable income streams.
Why This Matters: AI’s Thirst for Data
Modern AI models rely on vast datasets to learn and improve, and companies are now leveraging the gig economy directly to acquire that data cheaply and efficiently. The tasks DoorDash assigns yield precisely the kind of real-world footage and audio that AI models need to refine their understanding of human actions, environments, and even cultural nuances.
This trend highlights the growing demand for “ground truth” data – the accurate, labeled information that machine learning depends on. As AI expands into robotics, retail automation, and other industries, the need for this type of training data will only increase.
Privacy Concerns and the Automation Question
DoorDash asserts it maintains “robust privacy safeguards,” but offers no specifics. The fact that Taskers must avoid “political content” and “identifiable information” suggests the company is aware of potential ethical and legal risks.
A more fundamental question looms: what happens when these AI models are fully trained? Will they be used to automate jobs currently held by humans, including those of the very workers providing the training data? The industry’s silence on this matter is troubling, as it suggests a deliberate disconnect between the present exploitation of gig workers and the future automation of their roles.
Limited Availability, Broad Implications
The Tasks app is currently rolled out in select areas of the US, with DoorDash partnering with businesses across retail, insurance, hospitality, and tech to supply training data. While the rollout is limited for now, the model has clear scaling potential: the company is effectively turning its driver network into a data-collection pipeline, monetizing human activity for the benefit of AI development.
This experiment is not just about DoorDash; it signals a broader shift. The gig economy, already known for its precarious labor conditions, is now being harnessed to feed the insatiable appetite of AI systems.
