Data Altruism by Default: An Alternative to Consent for Personal Data Processing in Machine Learning
This paper is not about data altruism in the lexicological or grammatical sense of the term, nor about data altruism in the sense of the Data Governance Act. Instead, it is about carving out an exception for certain types of data processing conducted for the development of AI systems, in an attempt to enable the development of high-quality AI systems while striking an appropriate balance between the developers of such systems and the data subjects whose data they rely on. The paper aims to confront these challenges by critically examining the monetization of personal data obtained through dubious and largely unlawful practices. Provisionally labelled as data altruism, such invasive data practices deny users any real choice. The data altruism proposed in the Data Governance Act, on the other hand, does little to resolve the existing issues, as the alternative it provides is overly limited in scope. In contrast, the paper provocatively proposes a novel framework that would justify the necessary processing practices while imposing objective and enforceable requirements on those engaging in them. In light of this, the paper underscores the need for robust safeguards and highlights the importance of objective limitations on the associated business models. The proposed version of data altruism differs crucially from both versions previously examined; it should, however, not be understood as a solution but rather as a provocative idea intended to stimulate meaningful discussion at both the societal and the political level.