
Epistemology


The phrase “Data Justice” was described in 2016 by Lina Dencik, Arne Hintz, and Jonathan Cable as a model bridging technology justice and social justice.[1] It emerged as a response to a disconnect between the two concepts and has become a widely cited conceptual framework in studies on governance.[2] Before then, scholarship on data justice was limited; most of it was published after 2000 and spanned disciplines including the social sciences, medicine, and engineering.[3]

Data justice has gained prominence in response to large-scale data collection initiatives, particularly with surveillance programs as they relate to policing and citizenship.[4][5]

Data Justice


Data justice is a conceptual framework addressing fairness in the way data is collected, accessed, and used.[6] It emerges from concerns about how data systems can reinforce or mitigate social inequalities.[6] The concept is grounded in the intersection of data ethics, social justice, and critical data studies.[7] It critiques the underlying algorithmic systems and data infrastructures, including who designs them, who benefits from them, and who is excluded.[6] Scholars have drawn on Paulo Freire's Pedagogy of the Oppressed to inform data justice because of its focus on structural inequalities.[3]

As technology adoption rises globally, data justice scholars emphasize the importance of developing a global understanding of just data use.[3] The term has a relational nature: it is social justice as a response to datafication.[3]

Data justice posits that the economy of datafication and politics are intertwined, and that movements toward data justice must include civic participation.[8] Technocratic and privatized data collection can result in civic disengagement and the advancement of pre-existing power structures.[9] Data justice examines structural inequalities that manifest in the experiences of data use across the globe.[6] It questions whether current data governance structures exploit or restrict marginalized communities, and how they may affect individuals' sense of agency, autonomy, and representation.[6]

Theoretical Foundations


Data justice has been shaped by interdisciplinary scholarship in fields such as post-colonial theory, digital capitalism, and political theory.[3] Scholars such as Taylor, Dencik, and Hintz have explored how data governance intersects with structural power, marginalization, and civic agency.

Taylor has proposed dividing the data justice framework into three pillars: (in)visibility, (dis)engagement with technology, and nondiscrimination.[6]

(In)visibility


(In)visibility refers to privacy and representation.[6] It asks whether citizens understand data as a commodity and examines the different facets of information privacy.[6]

(Dis)engagement


(Dis)engagement promotes autonomy in technology use, such as the freedom to not use it, and also agency for individuals to dictate the utility of their own data.[6]

Nondiscrimination


Nondiscrimination challenges the historical biases embedded in data processing, especially as they manifest in surveillance systems.[6] Taylor argues that as computer systems grow more advanced, even their creators will be unable to identify how the systems perpetuate bias, making that bias harder to challenge.[6]


Richard Heeks’ theoretical framework divides data justice into three dimensions:

Instrumental Data Justice


Instrumental data justice revolves around the outcomes of data use.[7] In focusing on outcomes, this approach takes a utilitarian stance: what matters is the economic or individual benefit produced, so it is not the collection of data itself that is unjust, but rather the results of its use, which may or may not be.[7] Instrumental data justice suggests that data and its collection cannot be inherently just or unjust, and that injustices arise only from a violation of a duty of care.[7]

Procedural Data Justice


Procedural data justice focuses on the processes by which data is collected, categorized, and applied.[7] It emphasizes transparency, participation, and the right to contest data-driven decisions.[7] Procedural data justice advocates posit that collecting data should require explicit or implied consent.[7] Explicit consent requires consenting to each particular act of data handling, while implied consent can be given by accessing a particular service, such as a phone plan, or by taking a photograph.[7] Both include the right to withdraw consent.[3] Procedural data justice also requires that data collection be perceived as fair by its subjects.[7] Fairness is associated with control over the processes.[7] Heeks further explores fairness by expanding its definition to include the criteria of consistency, correctness, and correctability.[7] If data collection is carried out in a consistent, accurate, and objective way (correctness), subjects can perceive the process as fair.[7]

Distributive Data Justice


Distributive data justice concerns who has access to data and how the benefits and burdens of datafication are shared.[7] It highlights the asymmetries between those who generate data and the institutions or corporations that control it.[7] Distributive data justice concerns data privacy, ownership, and representation.[7] As data creators, individuals are often unaware that their information is being collected.[7] Distributive data justice aims to return control of data to its producers.

Critiques and Theoretical Debates


Data justice scholars argue that structural inequalities are embedded in the foundations of data collection and governance. Critics have examined how data systems reproduce existing power structures, often prioritizing institutional or commercial interests and imposing normative assumptions about identity, value, and knowledge.

Power and Data Extraction


Large technology companies play a central role in shaping contemporary data systems, frequently operating with limited oversight.[10] Scholars such as Taylor argue that corporate data practices prioritize profitability over public interest, leading to exploitative forms of data extraction.[6] This has led to comparisons with resource extraction models, where data is treated like other natural resources.[11] Subsequently, data is mined, commodified, and monetized without adequate return or recognition for those from whom it is sourced.[7] This critique is closely linked to the concept of data colonialism, which describes the appropriation of data from marginalized populations, particularly in the Global South, without consent or benefit.[6][3] Scholars such as Azadeh Akbari and Taylor suggest that these practices mirror historical patterns of imperialism and economic domination, extending colonial logics into the digital realm.[6][3]

Governmentality and Biopower


Theoretical debates in data justice draw on Michel Foucault’s concepts of governmentality and biopower to examine how data systems enact control. Governmentality refers to the ways institutions shape individuals into governable subjects through administrative processes, norms, and surveillance.[12] Scholars such as Aaron Martin and Taylor argue that digital infrastructures not only collect data but also help construct behavioral norms and define the roles individuals occupy in society.[13] These infrastructures determine who receives public services and who is surveilled.[13] Through this system, individuals are not represented through data but are instead governed by it.[13]

The related term biopower describes the management of life through institutions.[14] In this context, biopower operates through algorithmic profiling and classification, regulating access to resources and shaping social outcomes by reinforcing normative standards of identity, productivity, and visibility.[13][7]

In this debate, data justice involves confronting the outcomes of data use and the underlying structures of power that shape how data is produced and used.

Visibility and Invisibility


Data systems determine what is known, what other institutions can see, and under what terms.[6] Visibility in this context can offer recognition, access to services, and institutional support.[7] It can also bring intensified surveillance, control, and exclusion due to lack of representation.[6] Invisibility may protect individuals from scrutiny but can also result in marginalization or denial of rights.[7]

Scholars such as Taylor have emphasized that visibility through data is not inherently empowering. Data systems frequently reflect the interests of those who collect and manage data rather than those represented within it.[6] The terms under which individuals become visible are often externally imposed, uneven, and shaped by institutional or commercial priorities.[7] These dynamics are found in policing, and refugee and migration systems, where being visible in data can determine eligibility for aid.[7] In these cases, the costs and benefits of data visibility are uneven, reinforcing social hierarchies.[7]

Data justice critiques also point to the invisibility of data governance itself. While individuals are increasingly exposed through data collection, the processes by which this occurs remain opaque.[6] As a result, it is difficult for people to understand how their data is used or to challenge unfair outcomes.[6] In this debate, data justice seeks to interrogate both how visibility is constructed and who controls its mechanisms.

Individual and Collective Rights


A recurring tension in data justice lies between individual rights-based frameworks and the collective nature of data harms and governance.[15] Traditional models of data protection have been grounded in the Universal Declaration of Human Rights, which emphasizes privacy, ownership, and consent.[6] However, scholars such as Akbari and Taylor have critiqued this orientation as insufficient and neoliberal.[6][3] According to the critique, the model requires visibility and places the burden on the individual, even though data-driven harms occur at the systemic level.[3][6]

This critique is relevant to contexts where data is used to profile communities or predict behavior. Harm can occur without a single individual being targeted or aware of the consequences.[7] As a result, scholars have advocated for a reorientation toward collective data governance models that account for shared responsibilities, communal rights, and group-based harms.[6] These approaches aim to align data justice more closely with structural understandings of inequality and social power.

The Panopticon and Surveillance Assemblages


Data justice theorists have also revisited the concept of the panopticon, originally developed by Jeremy Bentham and later expanded by Michel Foucault as a model of disciplinary surveillance.[14] While influential in early surveillance theory, scholars such as Taylor and Dencik et al. argue that the panopticon is no longer an accurate metaphor for contemporary data surveillance systems.[6][16] In contrast to the centralized, visible surveillance imagined in the panopticon, modern data surveillance is decentralized, opaque, and multi-platform.[6] Corporations, governments, and third-party data brokers collect and process user data for varied purposes, often without the knowledge or consent of those being observed.[6] Unlike in the panopticon model, users do not consciously self-censor their behavior in response to being watched.[16]

To account for this, Dencik et al. propose the concept of a ‘veillant panopticon assemblage,’ which reflects the mutual, participatory, and distributed nature of modern surveillance.[16] In this assemblage, surveillance is no longer strictly top-down but instead consists of distributed observation, in which users may simultaneously be subjects and agents of surveillance.[16] This expanded framework allows data justice scholars to analyze the complexities of power, agency, and accountability in digitally mediated surveillance environments.

Applications and Case Studies


Data justice has practical implications across a range of sectors where data technologies intersect with governance, surveillance, and everyday life. Scholars and activists have emphasized that data injustice follows pre-existing structures of discrimination; it unevenly targets racialized, poverty-stricken, and other marginalized communities.[6] Taylor connects data justice to issues of climate justice, terrorism, and poverty, arguing that data technologies operate according to a narrow, middle-class standard.[6] As a result, the technologies do not account for the lived realities and conditions experienced by those outside normative data sets.[6]

Surveillance and Predictive Policing


The integration of data analytics into policing practices is a focal point in data justice discourse. Predictive policing refers to the use of algorithms and historical crime data to forecast future crime patterns geographically, as well as risk scoring, which attempts to predict the likelihood of individuals (re)offending or becoming victims.[10]

Critics argue that predictive policing replicates biases embedded in historical crime data to forecast future criminal activity.[17] Since the data often reflects disproportionate surveillance of racialized and low-income communities, predictive algorithms may reinforce patterns of over-policing, rather than correcting these injustices.[17] Additionally, concerns have been raised about the accuracy and fairness of predictive models. Kate Crawford has highlighted how flawed or incomplete data sets can skew results, leading to misleading risk assessments.[17] Additionally, the implicit biases of the system designers may be embedded into the models.[6] Scholars have also argued that predictive policing has not demonstrated consistent effectiveness in preventing crime.[9]

Public perceptions of surveillance are also relevant to this application. Research by Dencik et al. suggests that many individuals believe surveillance primarily targets those engaged in wrongdoing.[16] However, their interviews with political activists revealed that although these individuals were aware they were being monitored, they did not view their actions as unlawful or threatening.[16] This underscores broader concerns in data justice about how ambiguous definitions of deviance and opaque surveillance systems can undermine democratic expression and civic participation.

Activism and Alternative Models


The Right to the City and Digital Resistance


The concept of the right to the city has been adopted by some data justice scholars and activists as a political framework for reclaiming control over urban data environments. Originally developed by Henri Lefebvre, the right to the city emphasizes collective participation in designing and governing urban life.[9] In the context of data justice, this framework has been extended to address the datafied city, where urban management is increasingly shaped by data-driven technologies and digital surveillance systems.

Currie et al. argue that as cities undergo technological transformation, there is a growing need for democratic participation in digital decision-making.[9] They critique the privatization of urban data systems, wherein private-sector actors control the collection, storage, and use of data generated for public institutions.[9] This shift, they suggest, risks advancing technocratic models of governance that prioritize economic efficiency over equity and accountability, reinforcing existing social inequalities by limiting public oversight.[9]

Data justice movements align with the right to the city's call for the democratization of digital space. They argue that citizens should have meaningful input into how their data is collected, interpreted, and used to shape policy.[9] This approach positions data not simply as a technical asset, but as a site of political negotiation, where control, participation, and justice are actively contested.


References

  1. ^ Dencik, Lina; Hintz, Arne; Cable, Jonathan (24 November 2016). "Towards data justice? The ambiguity of anti-surveillance resistance in political activism". Big Data & Society. 3 (1). doi:10.1177/2053951716679678.
  2. ^ Akbari, Azadeh (6 December 2024). "The politics of data justice: exit, voice, or rehumanisation?". Information, Communication & Society (17): 1–17. doi:10.1080/1369118X.2024.2437015.
  3. ^ a b c d e f g h i j Cite error: The named reference “Akbari” was invoked but never defined (see the help page).
  4. ^ Heeks, Richard; Renken, Jaco (19 November 2016). "Data justice for development: What would it mean?". Information Development. 34 (1): 90–102. doi:10.1177/0266666916678282.
  5. ^ Taylor, Linnet (1 November 2016). "What is data justice? The case for connecting digital rights and freedoms globally". Big Data & Society. 4 (2): 1–12. doi:10.1177/2053951717736335.
  6. ^ a b c d e f g h i j k l m n o p q r s t u v w x y z aa ab ac ad Cite error: The named reference “Taylor” was invoked but never defined (see the help page).
  7. ^ a b c d e f g h i j k l m n o p q r s t u v w x Cite error: The named reference “Heeks” was invoked but never defined (see the help page).
  8. ^ Currie, Morgan; Knox, Jeremy; McGregor, Callum (2022). "Introduction". In Currie, Morgan (ed.). Data Justice and the Right to the City. Edinburgh: Edinburgh University Press. ISBN 9781474492973. Retrieved 23 March 2025.
  9. ^ a b c d e f g Cite error: The named reference “RTTCIntro” was invoked but never defined (see the help page).
  10. ^ a b Jansen, Fieke (2022). "Predictive Policing: Transforming the city into a medium for control". Data Justice and the Right to the City. Edinburgh: Edinburgh University Press. ISBN 9781474492973.
  11. ^ Martin, Aaron; Taylor, Linnet (29 August 2020). "Exclusion and inclusion in identification: regulation, displacement and data justice". Information Technology for Development. 27 (1): 50–66. doi:10.1080/02681102.2020.
  12. ^ Gutting, Gary; Oksala, Johanna (Fall 2022). "Michel Foucault". Stanford Encyclopedia of Philosophy. Retrieved 1 April 2025.
  13. ^ a b c d Cite error: The named reference “Martin” was invoked but never defined (see the help page).
  14. ^ a b Cite error: The named reference “Foucault” was invoked but never defined (see the help page).
  15. ^ Dencik, Lina; Hintz, Arne; Redden, Joanna; Treré, Emiliano (13 March 2025). "Collectivity in data governance and data justice". Information, Communication, & Society: 1–8. doi:10.1080/1369118X.2025.2478096.
  16. ^ a b c d e f Cite error: The named reference “DencikS” was invoked but never defined (see the help page).
  17. ^ a b c Crawford, Kate (2021). Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence. New Haven: Yale University Press. ISBN 9780300252392.