Designing AI: The Feminist Way

An overview of projects and artworks that tackle issues in AI design and promote principles of equity and inclusivity

By Beth Jochim, Creative AI Lead at Libre AI

“Technology is neither good nor bad; nor is it neutral.” - Melvin Kranzberg

“Data is the new oil.” [1] The expression, attributed to the British businessman and mathematician Clive Humby, draws a comparison between big data and oil: to use either resource profitably, it must first be transformed from its raw state into a refined product.

Although the expression has found widespread use, it has not been accepted by everyone, for several reasons. One is that while oil is a natural resource destined to run out, the quantity and availability of data move in the opposite direction: they are constantly increasing. Another is that extracting facts from data is not the same as gaining insights on which to base informed decisions. Ronald Schmelzer discussed this in his article for Forbes [2], where he dismantled Humby's saying through precise reasoning.

We live in a society that relies increasingly on machine learning and artificial intelligence, fields known for their insatiable hunger for the data with which to train models. If a similarity with oil can be found, it lies in the elites that control these sectors. Schmelzer, like others, pointed out that only a few companies control AI-based technologies: “the oil barons of the information economy are those who are doing it best - the Googles, Facebooks, Amazons, Microsofts, and their ilk of the world”. This repeats what has already happened in the oil sector. A critical difference between the two fields, however, is that any organization capable of strategically collecting and processing structured and unstructured data can, over time, gain insights for its digital transformation.

Technologist, visual artist, and writer James Bridle, in “Data isn’t the new oil — it’s the new nuclear power” [3], takes up Humby's metaphor and exposes the dark side of data. Bridle writes: “Our thirst for data, like our thirst for oil, is historically imperialist and colonialist, and it’s tightly tied to capitalist networks of exploitation” and continues: “Data-driven regimes repeat the racist, sexist and oppressive policies of their antecedents because these biases and attitudes have been encoded into them at the root.” Bridle focuses not only on the exploitation of environmental resources needed to extract and refine data (exactly as happens with oil), but above all on the social poisoning that can result from imbalances of power between the actors involved, arguing that when power is called into play, data is extracted by force rather than freely given by the population.

Big data has led to progress in many sectors, such as medicine, transportation, and finance, but it has also created social tensions and disparities, establishing a de facto state of control in both the digital and non-digital worlds. Treating data as a commodity therefore appears problematic, if not dangerous.

Catherine D'Ignazio, Assistant Professor of Urban Science and Planning in the Department of Urban Studies and Planning at MIT, and Lauren F. Klein, Associate Professor of English and Quantitative Theory and Methods at Emory University, addressed the thorny issue of a data science and big data field dominated by “white, male, and techno-heroic” figures. In their book “Data Feminism” (2020), the scholars examined critical issues across the world of data (from how data are collected, stored, manipulated, and visualized, to the working groups that analyze them and make decisions based on them) and offered a strategy. Fostering social justice, the authors drew on an intersectional feminist thought inspired by the Black feminism born in the United States with Kimberlé Crenshaw, Patricia Hill Collins, and the Combahee River Collective. The book is aimed at a varied audience that includes people in technical and research fields (such as computer science, data science, and women's and gender studies), as well as designers, journalists, and all those who deal with information and communication. [4]

In recent years, especially with the advent of generative adversarial networks (GANs), designed by machine learning expert Ian Goodfellow and colleagues (2014), deep learning has made its way into the art world, leading artists to work directly with large amounts of data. The entanglement of cutting-edge technology and art has shown the potential of deep neural networks in generating imagery, but it has also accelerated a series of ethical and philosophical reflections on artificial intelligence, bringing to light issues inherent to deep learning technology, such as biases and dynamics of power.

The claim [5] by computer scientist and Facebook Chief AI Scientist Yann LeCun that biases are contained only in the data sparked a reaction in the scientific community, which took the opportunity to dig into the far more complex reality of designing an AI system, recognizing that prejudice can infiltrate not only through the data, but also through the annotation, the models, and the design itself.
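To make this concrete, the sketch below (a hypothetical Python example, not drawn from any of the projects discussed here; the field names are invented) shows how even a minimal audit might surface problems at two of these stages: who is represented in the data, and whether annotators systematically disagree about particular groups.

```python
# A minimal, hypothetical audit sketch: it checks two of the stages where
# bias can creep in, the data itself (who is represented?) and the
# annotation (does disagreement cluster in some groups?).
from collections import Counter

# Hypothetical records: each item carries a demographic group label and
# independent labels from two annotators.
records = [
    {"group": "A", "annotator_1": "positive", "annotator_2": "positive"},
    {"group": "A", "annotator_1": "negative", "annotator_2": "negative"},
    {"group": "B", "annotator_1": "positive", "annotator_2": "negative"},
    {"group": "B", "annotator_1": "negative", "annotator_2": "positive"},
]

# 1. Representation: is one group heavily over- or under-represented?
group_counts = Counter(r["group"] for r in records)
print("samples per group:", dict(group_counts))

# 2. Annotation: do the annotators disagree more often on certain groups?
disagreement = Counter(
    r["group"] for r in records if r["annotator_1"] != r["annotator_2"]
)
for group, total in group_counts.items():
    rate = disagreement.get(group, 0) / total
    print(f"group {group}: annotator disagreement rate = {rate:.0%}")
```

A check of this kind says nothing about the model or the design stage, of course; those require their own scrutiny, which is precisely the point of the critique.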

To counter the omnipresence of the male figure in both the artistic and technological ecosystems, artist and designer Mary Flanagan created, through a trilogy of works, a Feminist AI. As presented by the artist, the project addresses the frequent problem that sees women and people of color “left out of the conversation in the making, the structuring, the producing of the code, and of the creations that AI can engender.” The work takes its cue from, and further develops, the philosophical reflections on technology proposed by writer Mary Shelley in “Frankenstein”, opening a discussion on the myth of Prometheus and on the risks of challenging the limits imposed on us by nature.

[Grace:AI] is a system based on a Deep Convolutional Generative Adversarial Network (DCGAN) trained on an ever-expanding dataset of tens of thousands of works made exclusively by female artists, chosen one by one by Flanagan. According to the artist, the construction of the database represented a real challenge, forcing her to scrape web sources one by one because art databases often do not allow searching by the artist's sex or gender. In creating such a large database, Flanagan worked with the Met, the National Museum of Women in the Arts, and Indiana University.
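As an illustration of what such curation can involve, the sketch below assumes a hypothetical file layout (curated_artists.txt, metadata.csv, a folder of scraped images); it is not the project's actual code, only one way a hand-curated artist list could be used to filter scraped material into a training set.

```python
# A minimal sketch, under assumed file names and layout, of a curation step:
# keep only works attributable to artists on a hand-curated list before the
# images are handed to a DCGAN training loop.
import csv
import shutil
from pathlib import Path

raw_dir = Path("scraped_images")       # images gathered from web sources
out_dir = Path("grace_training_set")   # curated set used for training
out_dir.mkdir(exist_ok=True)

# One artist name per line, chosen one by one.
curated = {line.strip().lower() for line in open("curated_artists.txt")}

# metadata.csv maps each downloaded file to the artist it is attributed to
# (columns: filename, artist).
with open("metadata.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["artist"].strip().lower() in curated:
            shutil.copy(raw_dir / row["filename"], out_dir / row["filename"])

print(f"kept {len(list(out_dir.iterdir()))} images for DCGAN training")
```

A standard DCGAN training loop could then read its images directly from the curated folder.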

Mary Flanagan's talk at the NeMe Arts Centre during the 'Children of Prometheus' exhibition. For more info on the project: neme.org/projects/children-of-prometheus. Source.

The project includes [Grace:AI] – Origin Story (Frankenstein), 2019, [Grace:AI] – Daydreams, 2020, and [Grace:AI] – Portraits, 2020, and reflects on aspects of gender, feminism, and the nature of art. Starting from the images of the father figure Frankenstein generated in the first work, the system then learns to daydream and finally comes to produce portraits and landscapes that are the result of its intimate experiences.

The need to challenge a technological world that discriminates against minorities, limits female participation, lacks diversity in its working groups, and does not support alternative ways of thinking is at the heart of the Feminist Internet work of internet equality advocate Charlotte Webb. Defined as a “non-profit organisation on a mission to make the internet a more equal space for women and other marginalised groups through creative, critical practice”, Feminist Internet revolves around the concepts of participation and inclusion.

Among its projects, F’xa (2019) is a playful chatbot that helps users navigate the world of biased AI and suggests ways to mitigate the prejudices embedded in everyday technology. A heterogeneous group took care of its design to ensure the widest possible representation of views. The work was based on Josie Young's research on ethical and feminist AI (specifically on the Feminist Chatbot Design Process (FCDP)) and used the Feminist PIA (Personal Intelligent Assistant) standards, adapted from Young's work (Designing a Feminist Chatbot Framework, 2017).

Designing The Feminist Internet, Charlotte Webb, TEDxBarcelonaWomen, 2019. Source.

The role of the PIA standards is to help AI designers incorporate feminist values (understood as principles of social justice and equality) and avoid perpetuating the prejudices that can creep into the different phases of creating a chatbot. Through the chatbot, users learn what artificial intelligence is and what feminism means (including its plurality of definitions), coming into contact with the most common problems that afflict today's AI-based systems. In an interview with Dazed, Webb underlined that the project's goal is to educate the next generation of designers, making sure they develop a mindset aligned with ethical principles to bring with them into industry.

Inclusivity and diversity also underpin the art project Women Reclaiming AI (WRAI), 2019. Artists and technologists Coral Manton and Birgitte Aga, in collaboration with “a growing community of self-identifying women”, have developed a conversational artificial intelligence system that positions itself as a medium for resisting the stereotypes that reinforce traditional gender roles and leave no room for a diversity of voices and points of view in our society.

Coral Manton, Women Reclaiming AI, 2018. Source.

WRAI consists of a voice assistant designed by the collective and of GAN-generated images of women that evolve over time. To learn to generate alternative imagery, the GAN is trained on a mixed dataset of images that come both from the women who participated in the project and from female figures considered inspiring. The result is an alternative dataset perceived as more inclusive and representative of the group's values.
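As an illustration of how two such image sources might be merged in practice, the sketch below (folder names and image size are assumptions, not WRAI's actual pipeline) concatenates the two collections into a single GAN training set using PyTorch.

```python
# A minimal sketch, with hypothetical folder names, of merging two image
# sources (participants' photographs and images of inspiring figures) into
# one training set for a GAN.
from torch.utils.data import ConcatDataset, DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize(64),
    transforms.CenterCrop(64),
    transforms.ToTensor(),
])

# torchvision's ImageFolder expects one subfolder per class inside each root.
participants = datasets.ImageFolder("data/participants", transform=transform)
inspirations = datasets.ImageFolder("data/inspirations", transform=transform)

# The GAN only needs the images, so the two sources are simply concatenated.
mixed = ConcatDataset([participants, inspirations])
loader = DataLoader(mixed, batch_size=64, shuffle=True)

images, _ = next(iter(loader))
print(images.shape)  # e.g. torch.Size([64, 3, 64, 64])
```

The composition of the merged set, rather than any change to the GAN itself, is what shifts the imagery the model learns to produce.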

Caroline Sinders employs design to investigate the world. As a machine-learning design researcher and artist, her interests span from online harassment to politics, from privacy to data, from artificial intelligence to social networks, and more. Feminist Data Set (2017 – ongoing) is a project with a critical yet artistic vision of machine learning that questions every phase of the creation of a conversational AI system. From data collection to annotation, from training to the choice of algorithms and models, Sinders's attention revolves around designing a system that is feminist, intersectional, and (possibly) free of bias.

Caroline Sinders, AI Is More Than Math: Using Design To Confront Bias, 2019. Source.

In addition to the Feminist Data Set workshops, the work also includes the Feminist Data Set Tool Kit and the open-source TRK tool. The project, which has been exhibited in many venues including Ars Electronica and the Victoria and Albert Museum, is considered by the artist “an artistic practice aimed at the public and social justice”. It opposes the technological domination of large companies through the conscious creation of data, datasets, and archives. The work is carried out through public forums and workshops that involve participants in a conscious way, and it is guided by a community spirit that carries forward a political ideology of resistance to Big Tech.

Ars Electronica has launched several initiatives to increase the visibility of women working in the field of media art. One of these is the “Women in Media Arts” database, which is constantly updated and expanded and is accessible through its Online Archive. The database serves as a research platform for anyone interested in the topic of women in media art. [6]

Outside the artistic sphere, there are also initiatives that promote inclusiveness. Last, but not least, is the collaboration between digital activist and computer scientist Joy Buolamwini and the beauty brand Olay. The #DecodeTheBias campaign aims not only to tackle racial bias in facial recognition systems (specifically, bias in the search engines and algorithms that decide the standards of female beauty), but also to close the gender gap in STEM by promoting a program to double the number of women and triple the number of women of color in the field by 2030. [7]

Together, art and technology offer not only an opportunity to understand how complex systems work, but also a means of becoming more aware of the problems associated with technology. It is therefore up not only to designers, researchers, and artists, but to all of us, to become aware of these issues and take greater control of our digital and non-digital lives.


References

  1. Palmer, Michael (2006). “Data is the New Oil.” ANA Blogs, retrieved from https://ana.blogs.com/maestros/2006/11/data_is_the_new.html in September 2021.
  2. Schmelzer, Ronald (2020). “Unleashing The Real Power Of Data.” Forbes, retrieved from https://www.forbes.com/sites/cognitiveworld/2020/02/06/unleashing-the-real-power-of-data/?sh=687b7d2e389d in September 2021.
  3. Bridle, James (2018). “Opinion: Data isn’t the new oil — it’s the new nuclear power.” ideas.ted.com, retrieved from https://ideas.ted.com/opinion-data-isnt-the-new-oil-its-the-new-nuclear-power/ in September 2021.
  4. Forrest, Jason (2019). “None Of Us Are Free If Some Of Us Are Not: Catherine D’Ignazio on Data Feminism.” Medium, retrieved from https://medium.com/nightingale/catherine-dignazio-on-data-feminism-ce3b3c65f04a in September 2021.
  5. LeCun, Yann (2020). Twitter, retrieved from https://twitter.com/ylecun/status/1274782757907030016?lang=en in September 2021.
  6. Grubauer, Anna (2020). “Women in Media Arts: Does AI think like a (white) man?” Ars Electronica Blog, retrieved from https://ars.electronica.art/aeblog/en/2020/04/10/women-in-media-arts-ai/ in September 2021.
  7. Browley, Jasmine (2021). “Olay Teams Up with Data Scientist to Help Tackle Racial Bias Against Black Women in New Initiative.” Essence, retrieved from https://www.essence.com/news/money-career/olay-partnership-joy-buolamwini/ in September 2021.

Photo (Cover) by ev / Unsplash