Crowding Out The Noise: Algorithmic Collective Action Under Differential Privacy

arXiv:2505.05707v2

Abstract: The integration of AI into daily life has generated considerable attention and excitement, while also raising concerns about automating algorithmic harms and re-entrenching existing social inequities. While the responsible deployment of trustworthy AI systems is a worthy goal, there are many possible ways to realize it, from policy and regulation to improved algorithm design and evaluation. In fact, since AI trains on social data, there is even a possibility for everyday users, citizens, or workers to directly steer the AI system's behavior through Algorithmic Collective Action, by deliberately modifying the data they share with a platform to drive its learning process in their favor. This paper considers how these grassroots efforts to influence AI interact with methods used by AI firms and governments to improve model trustworthiness. In particular, we focus on the setting where the AI firm deploys a differentially private model, motivated by the growing regulatory focus on privacy and data protection. We investigate how the use of Differentially Private Stochastic Gradient Descent (DP-SGD) affects the collective's ability to influence the learning process. Our findings show that while differential privacy protects individual data, it introduces challenges for effective algorithmic collective action. We establish this trade-off formally by characterizing lower bounds on the success of algorithmic collective action under differential privacy as a function of the collective's size and the firm's privacy parameters. We then verify these trends experimentally by simulating collective action during the training of deep neural network classifiers across several datasets. Finally, we perform a stylized economic analysis of privacy costs to integrate additional incentives, analyzing how utility and participation costs influence the formation of collectives under private training regimes.
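To see why DP-SGD is in tension with collective action, it helps to recall the mechanism itself: each per-example gradient is clipped to a fixed norm and Gaussian noise is added to the batch sum, so any individual's (or small collective's) contribution to an update is both bounded and partially drowned out. A minimal sketch of one such update step (simplified from the standard Abadi et al. recipe; function name and signature are illustrative, not from the paper):

```python
import numpy as np

def dp_sgd_step(per_example_grads, clip_norm, noise_multiplier, rng):
    """One DP-SGD update direction (simplified sketch).

    Each per-example gradient is clipped to L2 norm `clip_norm`, the clipped
    gradients are summed, Gaussian noise with std `noise_multiplier * clip_norm`
    is added, and the result is averaged over the batch.
    """
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose norm exceeds clip_norm.
        clipped.append(g / max(1.0, norm / clip_norm))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_example_grads)
```

Clipping caps each participant's per-step influence at `clip_norm / batch_size`, and the added noise must be overcome by the collective's aggregate signal, which is one intuition for why the paper's lower bounds on collective success degrade as the privacy parameters tighten or the collective shrinks.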
