The first-ever manager of your city’s newly formed Department of Data and Technology worries obsessively about making the wrong decisions when buying data and technology. You work in this department. While the organization hasn’t made major mistakes yet, the manager is always reading case studies of other communities whose decisions put vulnerable residents at risk or made it more difficult for departments to operate transparently, efficiently, independently, and in line with their community’s values.
Your manager is hoping to build a culture where people in the department feel comfortable asking critical questions about technology. They have developed this workshop, which shares some of the most unfortunate use cases that could be ripped from the headlines.
In this workshop, you and your peers will develop questions that could have been asked before harmful data and technologies were purchased. Your manager drafted these scenarios from some of the most problematic real-world use cases as a guide to what not to do. Through this training, your manager hopes that your organization will avoid many of the mistakes that others have already made.
A neighboring city contracted with a vendor to purchase AI technology that was used to create a dataset of public infrastructure conditions from imagery. A trusted local human rights organization points out that the company relies on frequently exploited workers in Venezuela to tag and process the images that are instrumental to the function of the algorithm. These workers are typically classified as independent contractors, are usually paid very low wages, and lack the power to object when unscrupulous firms invent ways to avoid paying them for their labor. The human rights organization has asked the city’s mayor to rescind the contract, but doing so may trigger a lawsuit from the company.
What questions could have been asked, and what rules and practices could have been put in place, before the contract was awarded that would have prevented this government from buying a product that exploits extremely vulnerable people?