What is data literacy and why does it matter?
We will be holding our second round of Data Literacy for Data Stewards training workshops this spring. This series emphasizes the importance of thinking critically about creating, applying, and managing data and technology. In these workshops, participants will learn to ask questions about data and technology, reflect on how their actions and decisions impact people and communities, and identify how data and technology are used to reinforce systems of oppression. They’ll also develop strategies to institutionalize the values of equity, fairness, and justice into the data and technology practices and infrastructures of their organizations.

Our twin pandemics of COVID and structural racism have highlighted the need for data practitioners to expose and dismantle technologies and practices that reinforce and amplify powerlessness, discrimination, and injustice. Increasing data literacy is just one of the ways to protect members of our community. Here are just a few examples of harms caused by data and technology in recent years:

Who are we looking for?
This series is designed to bring together people whose lives and communities are impacted by data, along with everyone who has a role in the data life cycle. People involved in the data life cycle include those who invest in, design, and manage data systems, data producers, data stewards, and people who use data to make decisions that impact communities. We’re seeking participants from community and nonprofit organizations, local government agencies, offices of elected officials, colleges and universities, local foundations, and civic institutions. Thanks to support from a local foundation, there is no cost to participate.

What will we cover?
We believe that “learning by doing” is a great way to build skills and confidence in data literacy and explore issues of power and privilege in data. Workshops will be structured around a series of interactive, participatory activities where attendees will work together in both small and large groups. Topics in this series of peer-learning activities will include:

What does it mean to participate?
In our spring 2023 cohort, we’re looking for up to 40 people to participate in weekly 90-minute virtual workshops as part of our initial 12-workshop series. We may also offer an additional in-person or virtual workshop to explore additional topics. While most of the work will take place in the workshop sessions, there may be one or two optional assignments or self-directed activities included in the series. We plan to start the series the morning of Friday, March 31 with an in-person celebration where participants from the first (fall 2022) and second cohorts will get to know one another. The workshops will then move online starting on Friday, April 14 at 11 am. We will hold sessions nearly every week at that time into July. We’ll send calendar invitations to all participants several weeks before the start of the series.

Benefits of participating
As a result of participating in this series, you’ll:

Who is organizing the training series?
The Western Pennsylvania Regional Data Center is organizing this training series. The Regional Data Center is a civic data partnership between the University of Pittsburgh, Allegheny County, and the City of Pittsburgh that works to make information available and accessible and works with partners to apply data for community impact. Through this series, we hope that participants will strengthen their practices, become more-responsible and more-confident stewards of data, and work to dismantle or avoid building oppressive systems.

How do I register?
Fill out our form or contact the team at the Western Pennsylvania Regional Data Center at wprdc@pitt.edu if you’re interested in participating. You can also use that email address for questions.

What people are saying
Participants in our first cohort offered testimonials about their experience. These are actual quotes people offered at the end of the first cohort, not something we made up to try and get you to spend time with us.

The Mayor is interested in using city data as an asset to measure performance. Trained as a community organizer, she has also mentioned that she’s a bit wary of adopting surveillance technologies that may put more-vulnerable members of the community at risk of harm. She campaigned on enhancing opportunities for community engagement.

To prepare for the meeting, participants in the workshop suggested that they:
Participants in the first cohort also rated their agreement with the following statements before and after the workshop series. Average scores are shown below.

| Statement | Pre-workshop average score | Post-workshop average score |
| --- | --- | --- |
| I understand the benefits and challenges that come from the use of data and technology. | 4.1 | 4.3 |
| I can describe the importance of ethical and just data and technology practices. | 3.9 | 4.6 |
| I form my own opinions related to the use of data and technology. | 3.9 | 4.3 |
| I can ethically justify decisions I make when it comes to data and technology. | 3.5 | 4.4 |
| I am knowledgeable about ethics and justice and am comfortable advising others working in data and technology. | 3.1 | 4.3 |
The first-ever manager of the newly formed Department of Data and Technology in your city worries obsessively about making the wrong decisions when it comes to buying data and technology. You work in this department. While the organization hasn’t made major mistakes yet, the manager is always reading case studies of how other communities have made decisions that have put vulnerable residents at risk, or decisions that have made it more difficult for departments to operate transparently, efficiently, independently, and in line with their community’s values. Your manager is hoping to build a culture where people in the department feel comfortable asking critical questions about technology. They have developed this workshop, which shares some of the most unfortunate use cases that could be ripped from the headlines. In this workshop, you and your peers will develop questions that could have been asked before harmful data and technologies were purchased. Your manager drew these scenarios from some of the most-problematic real-world use cases as a guide in what not to do, and hopes that through this training your organization will avoid many of the mistakes that others have already made.

One of the scenarios discussed involved the hidden, frequently-exploited labor force behind many of the artificial intelligence (AI) systems in use. Several recent news articles on the topic of AI colonialism told the stories of some of the people who perform this important labor in Venezuela and Kenya, and were the inspiration for the following scenario.
A vendor contracted with a neighboring city to provide AI technology used to create a dataset of public infrastructure conditions from imagery. A trusted local human rights organization points out that the company relies on frequently-exploited workers from Venezuela to tag and process the images that are instrumental to the function of the algorithm. These workers typically act as independent contractors, are usually paid very low wages, and lack the power to object when unscrupulous contractors invent ways to avoid paying them for their labor. The human rights organization has asked your mayor to rescind the contract, but doing so may trigger a lawsuit with the company. What questions could have been asked, and what rules and practices could have been put in place, before the contract was awarded that would have prevented this government from buying a product that exploits extremely vulnerable people?

In the workshop, participants thought the scenario could have been prevented had the following questions been asked prior to the purchase:
Here are the benefits, risks, and questions that people in one of the breakout groups developed. What are some of the benefits that can come from this algorithm?

Three years ago, East Versailles township contracted with a private vendor to manage tax collection efforts in the hope that an outside company could bring in more revenue than the existing collection process, which involved municipal staff. Additional revenue driven by greater compliance would forestall tax increases. After accounting for the cost of the contract, the revenue received by the local government has remained virtually unchanged since the township outsourced this work. The vendor has been growing increasingly worried that the township may decide not to renew the contract when it expires in two years. As a result, the company has proposed the use of a proprietary algorithmic tool that, according to marketing materials, “uses a range of data from previous returns, public records, and commercial sources (including credit scores)” to flag people whose returns should be audited because the algorithm suggests they may not have been paying their fair share. Thanks to the township’s new algorithms ordinance, the township must get approval from the commissioners before purchasing the algorithm from the vendor. As a commissioner, you will attend a public hearing on the matter.
“You work at the County and you were asked to make a version of the 911 emergency response data publicly available to cut down on the number of Right to Know requests. The burden of responding to each one of those requests can be substantial.”

911 Data includes: