Research Article
14 March 2025

Crafting a Situated Feminist Praxis for Data Regulation in the Age of Artificial Intelligence

Publication: Canadian Journal of Communication
Volume 50, Number 1

Abstract

Background: This analysis critically examines the reliance on universalist truths within big data and artificial intelligence (AI) systems. Drawing on Donna Haraway’s critique of objectivity, it challenges the notion that these technologies are neutral, emphasizing their embedded systemic biases.
Analysis: Through a reflection on evolving digital policy frameworks, including the Artificial Intelligence and Data Act (AIDA) and the General Data Protection Regulation (GDPR), this analysis highlights how the prevailing focus on individual rights and self-regulation fails to address the systemic marginalization inherent in AI development. By neglecting structural inequalities, these policies fall short of fostering equity and justice in AI systems.
Conclusions and implications: The study concludes with a framework to establish a situated feminist praxis in AI policy. This twofold approach advocates for 1) recontextualizing AI systems to account for systemic biases and 2) ensuring meaningful participation from diverse marginalized communities in policymaking processes.

Résumé

Contexte : Cette analyse porte un regard critique sur le recours à des vérités universalistes dans les systèmes de mégadonnées et d’intelligence artificielle (IA). S’inspirant de la critique de l’objectivité de Donna Haraway, elle remet en question l’idée que ces technologies soient neutres, en mettant l’accent sur les préjugés systémiques qui caractérisent celles-ci.
Analyse : À travers une réflexion sur les politiques numériques actuelles, notamment la Loi sur l’intelligence artificielle et les données (LIAD) et le Règlement général sur la protection des données (RGPD), cette analyse souligne comment l’accent mis sur les droits individuels et l’autorégulation ne permet pas de remédier à la marginalisation systémique inhérente au développement de l’IA. En négligeant les inégalités structurelles, ces politiques ne parviennent pas à favoriser l’équité et la justice dans les systèmes d’IA.
Conclusions et implications : L’étude se conclut en proposant un cadre pour établir une praxis féministe située relative aux politiques en IA. Il s’agirait d’une double approche qui préconise 1) de recontextualiser les systèmes d’IA pour tenir compte de préjugés systémiques et 2) d’assurer une participation significative des diverses communautés marginalisées dans l’élaboration de politiques pertinentes.


References

Adam, A. (1998). Artificial knowing: Gender and the thinking machine. Routledge.
Andrejevic, M., & Burdon, M. (2014). Defining the sensor society. Television & New Media, 16(1), 19–36.
Bannerman, S., Smith, K. L., Redden, J., Akanbi, O., Maqsood, S., Obar, J. A., & Streeter, T. (2023). Submission to The Standing Committee on Industry and Technology on Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts. https://www.ourcommons.ca/Content/Committee/441/INDU/Brief/BR12793483/br-external/Jointly8-e.pdf
Bates, J., Lin, Y.-W., & Goodale, P. (2016). Data journeys: Capturing the socio-material constitution of data objects and flows. Big Data & Society, 3(2).
Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim code (1st ed.). Polity.
Bornakke, T., & Due, B. L. (2018). Big–thick blending: A method for mixing analytical insights from big and thick data sources. Big Data & Society, 5(1).
Bowker, G. C., & Star, S. L. (1999). Sorting things out: Classification and its consequences. MIT Press.
Brandusescu, A. (2021). Artificial intelligence policy and funding in Canada: Public investments, private interests.
Brandusescu, A., & Sieber, R. (2023). Canada’s Artificial Intelligence and Data Act: A missed opportunity for shared prosperity.
Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of the 1st Conference on Fairness, Accountability and Transparency, 81, 77–91. https://proceedings.mlr.press/v81/buolamwini18a.html
Carmi, E. (2021). A feminist critique to digital consent. Seminar.net, 17(2).
Chun, W. H. K. (2018). Queerying homophily. In C. Apprich, W. H. K. Chun, & F. Cramer (Eds.), Pattern discrimination (pp. 59–97). Meson.
Clement, A. (2023). Preliminary analysis of ISED’s C-27 list of 300 stakeholder consultation meetings.
Costanza-Chock, S. (2021). Design justice, AI, and escape from the matrix of domination. In N. Arista (Ed.), Against reduction: Designing a human future with machines (pp. 39–59). MIT Press.
Couldry, N. (2017). The myth of big data. In M. T. Schäfer & K. van Es (Eds.), The datafied society: Studying culture through data. Amsterdam University Press.
Crawford, K. (2021). Atlas of AI: Power, politics, and the planetary costs of artificial intelligence. Yale University Press.
Delacroix, S., & Lawrence, N. (2019). Bottom-up data trusts: Disturbing the ‘one size fits all’ approach to data governance. International Data Privacy Law, 9(4), 236–252.
D’Ignazio, C., & Klein, L. F. (2020). Data feminism. MIT Press.
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. St. Martin’s Press.
European Commission. (2024). Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts.
European Parliament and Council. (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation). Official Journal of the European Union, L 119, 1–88.
Fraser, N. (2012). On justice: Lessons from Plato, Rawls and Ishiguro. New Left Review, 74. https://newleftreview.org/II/74/nancy-fraser-on-justice
Gorwa, R. (2019). What is platform governance? Information, Communication & Society, 22(6), 854–871.
Government of Canada. (2022). An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act, and to make consequential and related amendments to other Acts (Bill C-27, 44th Parliament, 1st session). https://www.parl.ca/legisinfo/en/bill/44-1/c-27
Haraway, D. (1988). Situated knowledges: The science question in feminism and the privilege of partial perspective. Feminist Studies, 14(3), 575–599.
Harding, S. (1991). Whose science? Whose knowledge? Thinking from women’s lives. Open University Press.
Hope, A., D’Ignazio, C., Hoy, J., Michelson, R., Roberts, J., Krontiris, K., & Zuckerman, E. (2019). Hackathons as participatory design: Iterating feminist utopias. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). Paper 61, 1–14.
Jones, M., & McKelvey, F. (2024). Deconstructing public participation in the governance of facial recognition technologies in Canada. AI & Society.
Kitchin, R., & Lauriault, T. (2018). Toward critical data studies: Charting and unpacking data assemblages and their work. In J. Thatcher, A. Shears, & J. Eckert (Eds.), Thinking big data in geography: New regimes, new research (pp. 3–20). University of Nebraska Press.
Klein, L., & D’Ignazio, C. (2024). Data feminism for AI. Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’24), 100–112.
Luka, M. E., & Leurs, K. (2020). Feminist data studies. The International Encyclopedia of Gender, Media and Communication.
McKelvey, F., & MacDonald, M. (2019). Artificial intelligence policy innovations at the Canadian Federal Government. Canadian Journal of Communication, 44(2), 43–50.
Office of the Privacy Commissioner of Canada. (2020, November 12). A regulatory framework for AI: Recommendations for PIPEDA reform. https://www.priv.gc.ca/en/about-the-opc/what-we-do/consultations/completed-consultations/consultation-ai/reg-fw_202011/
Pateman, C. (1989). The disorder of women: Democracy, feminism, and political theory. Stanford University Press.
Pohle, J. (2016). Multistakeholder governance processes as production sites: Enhanced cooperation ‘in the making’. Internet Policy Review, 5(3).
Suárez-Gonzalo, S. (2019). Personal data are political. A feminist view on privacy and big data. Revista de Pensament i Anàlisi, 24(2), 173–192.
Tessono, C., Stevens, Y., Malik, M. M., Solomun, S., Dwivedi, S., & Andrey, S. (2023). AI oversight, accountability and protecting human rights: Comments on Canada’s proposed Artificial Intelligence and Data Act. https://www.ourcommons.ca/Content/Committee/441/INDU/Brief/BR12444167/br-external/CenterForInformationTechnologyPolicy-e.pdf
Thomasen, K., & Kim, R. (2023). Submission to The Standing Committee on Industry and Technology on Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts.
Tkacz, N., Henrique da Mata Martins, M., Porto de Albuquerque, J., Horita, F., & Dolif Neto, G. (2021). Data diaries: A situated approach to the study of data. Big Data & Society, 8(1).
Toupin, S. (2024). Shaping feminist artificial intelligence. New Media & Society, 26(1), 580–595.

Information & Authors

Published In

Canadian Journal of Communication
Volume 50, Number 1, March 2025
Pages: 26–37

History

Received: 15 March 2024
Revision received: 22 July 2024
Accepted: 12 August 2024
Published in print: March 2025
Published online: 14 March 2025

Keywords:

  1. Artificial Intelligence and Data Act
  2. General Data Protection Regulation
  3. artificial intelligence
  4. policy
  5. feminist AI

Mots clés : 

  1. Loi sur l’intelligence artificielle et les données
  2. Règlement général sur la protection des données
  3. intelligence artificielle
  4. politique
  5. IA féministe

Authors

Affiliations

Laine McCrory
Biography: Laine McCrory is a master’s student in the Joint Program in Communication and Culture at Toronto Metropolitan University and York University. Email: [email protected]
Toronto Metropolitan University and York University, Toronto, Ontario, Canada

Notes

Suggested citation: McCrory, L. (2025). Crafting a situated feminist praxis for data regulation in the age of artificial intelligence. Canadian Journal of Communication, 50(1), 26–37.
Data accessibility: The study data will not be made publicly available. Researchers who require access to the data can contact the corresponding author for further information.
Funding: The author received no funding for this research.
Disclosures: The author has no conflicts of interest to disclose.
