Special Section: Probing the System: Feminist Complications of Automated Technologies, Flows, and Practices of Everyday Life

Introduction
Paula Gardner
McMaster University
gardnerp@mcmaster.ca

Sarah Kember
Goldsmiths, University of London
s.kember@gold.ac.uk
Abstract
This Special Section presents diverse scholarly voices examining the silenced, underexposed, intersectional forces that fortify science and technology platforms in their work to automate public abidance. The articles probe, from diverse global locations and perspectives, the contemporary work of various “platforms,” understood broadly as technology and software, health, social media, and policy platforms. The articles probe these systems and platforms with attention to the assumptions and practices embedded in their algorithms, protocols, design specifications, and communications, and, in turn, the political, cultural, governance, and mediated practices they make possible. The research studies and practice-based work herein expose the complex and shifting sociopolitical codes and contexts that condition technology, artificial intelligence (AI), surveillance, health, social media, and state platforms that support systems of care, news, communication, and governance. These exposures show how platform craftiness works differently in different spaces to privilege and damage, often with ghostly obscurity. Attentive to how platforms operate in complex contemporary viral modes, the section seeks to locate and expose these traces, draped in what communication scholars Sangeet Kumar and Radhika Parameswaran (2018, 345) refer to as “chameleon cultural codes” that, in changing and transforming into unrecognizable forms, feed global imaginaries.
Keywords
platforms, systems, automation, intervention, feminist science and technology studies, HCI
When, before the advent of COVID-19, we first imagined this Special Section of Catalyst, we hoped to bring together different areas of feminist scholarship that conceptualize systems, along with the platforms that operationalize them, as political, networked, automating technologies that proscribe distinct practices and futures. As communication scholars who have long played, tarried, and tangled with science and technology studies (STS) feminism, we hoped to expose our common ground and divergences in system probing (by which we mean analysis and intervention) in order to pressure current conceptualizations and imagine sustainable futures. Articles came in as the COVID-19 pandemic emerged across the globe in the early months of 2020, newly exposing the uneven ground that enframes technology platforms, and reflecting economic and sociopolitical priorities that revert care, resources, and access to the always already prioritized. It became painfully visible that stay-at-home orders, vaccine nationalism, and travel restrictions privileged the privileged. Health policies arose reifying postcapitalist economic, gender, race, class, and caste elitism, and settler colonial and empire-building values, naturalizing the uneven flows of vaccines regionally and transnationally. Health policy platforms worked to automate public adherence to a politics of health disparity.
As section editors, we recognized the need to vividly illustrate everyday life in this pandemic moment, arming future readers to understand how biased practices of prevention, treatment, access, and care are amplified in a period when systems are transformed in response to crisis. As national health and commerce platforms and policies morphed, they have tended to shelter the privileged and safeguard industrial output to secure global trade and northern prosperity. Complaints that public health initiatives and platforms are grounded in white supremacy, colonial tropes, and racial and ethnic bias have often been sidelined by health officials, who tend to focus on immediate responses limited by the narrow register of the UN’s World Health Organization and the COVAX relief effort. Pandemic platform practices bring existing disparities into sharp relief, as argued by Ruha Benjamin (2019, 2020), who urges us to engage in viral justice to tackle hostile racial climates and manifest health systems that are more just and habitable.
This Special Section presents diverse scholarly voices examining the silenced, underexposed, intersectional forces that fortify science and technology platforms in their work to automate public abidance. The articles probe, from diverse global locations and perspectives, the contemporary work of various “platforms,” understood broadly as technology and software, health, social media, and policy platforms. The articles probe these systems and platforms with attention to the assumptions and practices embedded in their algorithms, protocols, design specifications, and communications, and, in turn, the political, cultural, governance, and mediated practices they make possible. The research studies and practice-based work herein expose the complex and shifting sociopolitical codes and contexts that condition technology, artificial intelligence (AI), surveillance, health, social media, and state platforms that support systems of care, news, communication, and governance. These exposures show how platform craftiness works differently in different spaces to privilege and damage, often with ghostly obscurity. Attentive to how platforms operate in complex contemporary viral modes, the section seeks to locate and expose these traces, draped in what communication scholars Sangeet Kumar and Radhika Parameswaran (2018, 345) refer to as “chameleon cultural codes” that, in changing and transforming into unrecognizable forms, feed global imaginaries.
Foregrounding feminist theory from STS, communication and media studies, feminist human-computer interaction (HCI), design studies, and digital humanities, this work deploys intersectional analysis to interrogate constrained system logics and expose opaque, biased algorithms and rules, and the wily means by which platforms morph toward power to produce seemingly objective, automated data output and practices. Automated system logic is pervasive and pernicious; it has been shown to be binary, hierarchical, governmental, causal, logocentric, and to normalize preordained outputs, produce rule-compliant users, and laud transhumanism. Bred in postcapital, empire-fortified spaces, automation enacts and reinforces a succinct set of neoliberal values—efficiency, compliance, and transparency—and carries legacies of gender, racial, ethnic, and nationalist bias.
This Special Section builds on diverse feminist scholarship on systems in STS and communication studies and other interdisciplinary spaces (e.g., Barad 2007; Chun 2009a, 2009b; Coleman 2009; Grosz 2005; Hayles 1999; Kember 2013; Munster 2006; Suchman 2007). This expansive work has unpacked pernicious system practices, demonstrating how automated systems prohibit interactive human-system communication by encouraging data valorization, blind acceptance of system flow and impact, and quick-fix, reductive responses to complex problems. To counter the pressures and practices arising from automation, diverse feminist scholars have employed a distributed logic or multi-agential approach to disentangle relationships binding technologies and the human subject.
What is the value of feminism in this terrain? Where HCI and communication scholars of the interface, system, and network decry the problematics of lauding code or linear logic, feminist scholars use diverse approaches that interrogate the communication and network logics that prohibit platform intervention. Lucy Suchman (2007), for example, shows the importance of revealing system logic in order to exploit it in informed, interactive engagement. At the same time, N. Katherine Hayles (1999) demonstrates how science and technology modular logics are problematic when deployed as the whole story, precluding evaluation of how complexity within systems constrains data. Drawing upon Karen Barad (2007), feminist scholars have pushed metaphors of the network and communication to understand how data or materials move according to system rules and show how system complexity causes productive interferences. Much of this systems analysis presumes Donna Haraway’s (1988) situated knowledges approach, recognizing that knowledges and systems reflect the social locations and identities that produce them. Haraway credits this concept to insights from “third-world feminism,” derived from lived experiences of strategically negotiating various and intersecting forms of oppression. The foundational feminist critical race, postcolonial, and transnational scholarship of Shulamith Firestone (1970), the Combahee River Collective ([1977] 2001), Audre Lorde ([1984] 2018), bell hooks ([1984] 2015), Patricia Hill Collins (1986), and Gloria Anzaldúa ([1987] 2012), among others, informed Haraway’s work, offering insights into the operation of power in systems and lessons for creative intervention.
Since that time, Ruha Benjamin, Beth Coleman, Wendy Hui Kyong Chun, Lisa Nakamura, and others have expounded upon the interrelations of race and technical systems, elaborating nuanced feminist analyses of how systems produce culture, community, publics, and media, therein reproducing gendered, racist biases—research that this Special Section builds upon.
Feminist scholars have offered innovative methods that both expose and exploit the complexity of systems. In theorizing how interactivity and intra-activity manifests in the networked flows within and across systems, feminist scholars such as Haraway, Hayles, Barad, and Chela Sandoval (2000) complicate how complexity operates. A distributed systems logic approach (Haraway 1990) encourages accessible logics and algorithms and actualizes complex modes of communication to elicit transgressive practices with media and with technologies. Distributed logics work to make visible a system’s own transgressive power, and reject simpler human-centric ontological network assumptions (e.g., grounded in binary logics, presuming a unified self) and premises of constrained relationality that produce what Haraway terms the “informatics of domination” (1990, 300)—a world grounded in authority, rationality, and computation. From the vantage point of distributed logics, we can see diffractions, or patterns of difference, rather than (hegemonic) reflections born of fixed positions. This approach reveals diverse paths, positions subjectivity as crucial to intervention and encourages system hacking.
Commonly, feminist scholars argue that the embodied human is better understood in systems envisioned as chaotic, where principles and rules are cut (Kember and Zylinska 2012) and bent (Barad 2007). Elizabeth Grosz (2005) engages a Deleuzian approach, challenging scholars to exploit the internal chaos of systems via transgressive feminist practices. Suchman (2007) proposes we engage in situated actions for developing different kinds of human relations to perception, objectivity, understanding, and production. Her approach conjures planned, creative interactions that allow users to reflect on proscriptive instructions, reject responding to plans and context, and act in reflexive response to things generated.
This body of work employs diverse feminist and critical approaches to unpack the complexity of system operations, productively frame and reframe the notion and work of distributed logics and deploy it in productive disruptions. And yet, there is scope for greater academic exchange to disturb our notions of systems and the platforms that support them.
Kember and Zylinska’s Life after New Media (2012) calls on feminist scholars to bridge our diverse research on human-machine and human-system interfaces to enhance possibilities for critical interventions addressing automated citizen and consumer responses. There is much space to intertwine art and media analysis with STS, contrasting our intersectional approaches, and engaging with other disciplines (postcolonial studies, critical disability studies, Indigenous studies, etc.), and research from emerging and Global South scholars to refresh system analysis. In the spirit of feminist critique, these confrontations can agitate our assumptions, lingering disciplinary habits, and habituated citation practices, and encourage analysis of evolving platforms, to kindle new insights and research directions. This section represents research that seeks to understand and exploit system complexity, to put it into sharp relief, enabling us to respond inventively to the things systems produce.
This type of intervention is evident in feminist scholarship over the past decade that has disrupted theorizations of human-machine interaction within HCI, offering new methods and close readings of technologies and practices to generate novel research questions. Foundational work from Shaowen Bardzell (2010), for example, outlined key qualities of feminist HCI aligned with critical design approaches including pluralism, participation, advocacy, ecology, and reflexivity, and, with Elizabeth Churchill, probed the links between design, feminism, and social science that can support technology’s role in social change (Bardzell and Churchill 2011). Ann Light (2011) queered interaction with computers while José Abdelnour-Nocera et al. (2013) deployed local and Indigenous Peoples’ perspectives to reframe HCI methods. Nassim Parvin (2019) provides nuanced studies of how AI agents, embedded in personal assistant technologies, normalize and redeploy destructive forms of body bias. A recent special issue explores the use of interaction design in HCI to address diverse women’s health issues (Almeida et al. 2020). Abundant work engages intersectional feminism to explore data science and ethics (D’Ignazio and Klein 2020), and how greater intersectional attention to the body can enhance critical analysis of women’s health in HCI research (Bellini et al. 2018), while Tamara Kneese and Beza Merid (2018) query how networked subjectivities arise in human-technology assemblages articulated to illness. As well, a recent Special Section of Catalyst brings fresh analysis to human interactions with activity trackers, deeply informed by intersectional approaches (Dolezal and Oikkonen 2021).
Feminists working with design practice have challenged traditional HCI scholarship approaches and assumptions. Rosanna Bellini et al. (2018), for example, reveal how feminist HCI research agendas—women’s health, design innovations, questions of inclusion, diversity, and identity—can remain isolated. Laura Forlano (2017) employed speculative design approaches to foreground feminist principles and values, while building empathy, transparency, and trust. Scholars have demonstrated the value of participatory design to enhance justice work in feminist HCI (Bardzell 2018) and interrogated the “co” in co-design (Choi and Light 2020), while Shruti Sai Chivukula and Colin Gray’s (2020) feminist HCI work discusses curating via “relatable” objects. Luke Stark and Kate Crawford (2019) have demonstrated how artists’ approaches have enabled distinct interrogations of the ethics engaged in data and machine learning systems.
Feminist HCI disruptions are evident in the emerging field of human-machine communication, which deploys critical and philosophical approaches to read technology as communicator to better understand human-computer, robot, and agent relations (Guzman 2018). The threads of formative STS work are evident here, reflected in methodological kinship with Suchman’s anthropological approach. As well, productive cross-pollination is visible—Forlano’s (2017) intersectional and crip approaches to feminist data practice are complemented in human-machine communication by co-design methods aiming to build crip- and aging-friendly therapeutic platforms (Gardner et al. 2021). In putting diverse approaches into dialogue, as the above examples illustrate, this Special Section hopes to productively converge critical approaches to produce new disruptions in our feminist scholarship. Interferences, as we know, are essential in our shared mission to innovate human-machine interactions with attention to transparency, rigorous collaboration, and consent, and to yield human-machine relations grounded in and generating enhanced equity, diversity, accessibility, and justice.
Much feminist research has addressed how global media systems cloak the complexity of multiple information flows by broadcasting a coherent product to the lowest common denominator media subject, and, over time, this work has taken up critical race and postcolonial studies to deepen and strengthen analysis. Key formative ideas infuse this area of research, revealing, for example, the reductionism rampant in the work of media platforms that create problematic systems. As Hayles (1999) has shown, modular systems are often presented as representing the whole, simplifying system complexity—neurochemical systems in the brain, for example, are presented as a coherent representation of cognition. Proscriptions for automated behaviors with data are, of course, layered with intersectional impact. Industry routinely recommends that workers self-regulate and surveil their own productivity. Crucially, such proscriptions belie gendered, raced, classed, ethnic, age, ability, and sexuality assumptions that privilege particular bodies. Absencing this intersectional complexity normalizes the larger system framework that supports neoliberal, global capitalist ideals, and increases technoprecarity. Automation logics in mediated environments also ask users to accept assigned roles and dole out credit to asymmetrically favor culturally dominant subjects. In demanding obedience to the system’s logic, automation encourages users to assume their place in the system, and to produce desired outputs infused with normative gender, race, and class characteristics. Automated systems are expert at producing uncritical users who maintain the status quo—consumers of government policy and TED Talks, obedient activity tracker consumers, subaltern class healthcare workers, compliant health consumers, and more.
Critical race, Indigenous, and postcolonial scholarship provide lenses and methodological blueprints that arm us with techniques to name, recognize, and redress the practices by which racial, class, and colonial-biased values are inscribed in knowledge and technology systems, their platforms, databases, and automated logics, in ways that are self-sustaining and reproductive. Important STS and communication scholars have deepened our feminist understandings of the discrete mechanisms by which we train our platforms to embody visibility (Buolamwini and Gebru 2018; Daniels 2019), fetishize code as a God trick (Chun 2009b), render race as code (Nakamura 2012), denaturalize Blackness, and fracture the body from its humanness into new subjects (Noble 2018; Browne 2015). Trailblazing work by Safiya Noble, Simone Browne, and Joy Buolamwini reveals how algorithmic platform and database bias trains surveillance and AI tools to recognize only some faces, bodies, colors, and shades; to classify and surveil Blackness; and to normalize whiteness across public spaces. Presciently, Browne (2015) declares this a new technology of branding, producing contemporary practices of spatial colonization, which is required, as Nakamura (2012) explains, to credibly demonize Blackness. Kim TallBear (2013) examines how mutually informing science and social systems denaturalize indigeneity. Her study of Native American DNA reveals how genetic science conflates the material (blood) and semiotic (race or tribe) in “markers” used to place Indigenous Peoples in distinct genetic categories. In this system, DNA becomes a potent metaphor and technology that works to denaturalize some Indigenous Peoples, with tragic consequences for land claims and sovereignty.
A distributed logic approach is evident in contemporary critical race work that hacks surveillance systems that reinforce racial and gender bias and perpetuate discriminatory practices of seeing and security (Buolamwini 2017; Noble 2018; Browne 2015). This body of work rejects the system-sanitizing turn to ethics and calls instead for systems to be rooted in social justice. Exposing the complex environments that produce platforms, and the discrete, often obscured manners in which platforms sort, categorize, and automate is crucial to imagining platforms that make possible equitable and sustainable futures. This work also calls out knowledge platforming within feminist scholarship as a key, lingering challenge, itself requiring disruption that decenters whiteness, class, and settler privilege and northern fetishization. TallBear and Sandoval offer methodological guides by which we infuse critical race studies into our methods. TallBear (2013) offers an objectivity in action model, typified by a “promiscuous” standpoint position by which we stand by and study across in order to co-constitute more ethical claims. Sandoval (2000) offers oppositional consciousness as method to confront academic colonialism, employing the politics and lived experience of “third-world feminism” to encourage greater mobility across discrete academic technologies of critique. As well, this methodology provides guidelines for moving beyond critique, to making theory actionable, and deploy it to confront oppression and recode tools of communication and intelligence. Drawing on this knowledge, this Special Section focuses on how automation, labeling, and categorization practices make visible the raced, gendered, sexuality, and otherwise biased environments that produce, deploy, and in turn reify system logics. 
The contributions illustrate how research that thoughtfully blends critical race, postcolonial, and queer approaches to critical systems analysis can generate evocative new systems understanding.
This Special Section incites feminist border crossings that challenge our current platform-enabled system analysis and create new possibilities for transgressing them, both in theory and in practice. The section is infused with reflections on the current moment where a global virus lays bare automated platform politics. Each article recognizes the platform as conditioned by biases that implicate the platform’s practice and its reproduction of bias, preference, and disparity. The authors get close to data in these studies, drawing on research experience, observation, and ethnography in deep, critical, and situated analyses. Each study appreciates the sociopolitical and cultural contexts that condition the limits and make possible the work of platforms with attention to the formative roles played by space and locality. The collection reveals the complex manners in which platforms naturalize and automate, necessitating complex analytic lenses that, in turn, open up possibilities for disruption and sustainable reinvention.
We hope these critiques and material engagements, in aiming to counter the constrained, modular, automated proscriptions and predications of systems, will generate inventive forms of differential consciousness that render visible and activate flows across knowledge systems. In this effort, we hope to make possible the imagination and incantation of new revolutionary manifestations of systems. The Special Section seeks to fortify us to better imagine and create platforms attendant to the politics of bias, the problems of automation, and the will to invisibility, and to direct the wily, mutating capabilities of platforms to create sustainable global futures.
In “The Space between Us,” Wendy Hui Kyong Chun probes the racial politics of neighborliness deployed in “homophilic” health policy—assuming that similarity breeds connection—during the COVID-19 pandemic. Chun shows how policies denoting COVID-safe spaces (of like neighbors) reflect the ethnically sovereign spaces enforced by containing Japanese Americans in American internment camps (during the Second World War) and Indigenous Peoples in reservations. The current spatial politics of “making common” extends from land to communal networking practices, effectively “containing” ethnic identities via colonial, racist agendas while producing widespread anti-Asian bias and violence. Chun contends that the notion of the neighbor is precisely where we must intervene to enable future communities.
Beth Coleman continues the focus on infrastructure and the potential for ambiguity—or the spaces in between knowledge categories—to imagine a generatively wild AI where Black techné disrupts a colonial sublime and its cyber-servo-mechanistic regime. Coleman targets predictive machine learning as predicated on racist assumptions that, via automated categorizing and sorting practices, are reproduced in the AI “surround”— a command-control array of sensors and informatics. In deploying generative (unsupervised) machine learning, Coleman conceptualizes a “technology of the surround” that unbinds categorical logics, and liberates subjects from capture and enslavement, to inhabit not an artificial but an alien intelligence and intra-active subject events (rather than subject/objects), understood as an excess of possibilities.
In “Swapping Gender Is a Snap(chat),” an analysis of Snapchat’s gender swap program, Teddy Goetz challenges the algorithmic reinscription of reductive gender norms and, drawing on interviews with transgender, non-binary, and/or gender queer individuals, underlines the tension that exists between the need for external legibility and internal authenticity. Goetz argues that “for trans persons, navigating gendered legibility requires balancing authentic expression, societal legibility, and reinforcing hegemonic norms.” While calling for a system that could mix and match, and allowing space for ambiguity and unbounded exploration, Goetz offers a salutary reminder of the experiential struggle for change.
In her critique of healthcare-based AI in India, Radhika Radhakrishnan also underlines the social cost of technological experiments undertaken in the name of development and the social good. Attending to the bodies and medical records of the sick-poor, used as data to train proprietary AI systems with no state regulation, Radhakrishnan offers an important rejection of the claim to spectacular, quick-fix technologies. The study calls for the end of state complicity in the production of experimental subjects as distinct from subjects at liberty to experiment with co-constitutive systems.
Missing data in their research process propelled the hauntological analysis by Katrine Meldgaard Kjær, Mace Ojala, and Line Henriksen, in “Absent Data.” The authors unpack the meaningfulness of the silences and absences that are common components of automated data collection from social media or the internet. The study demonstrates how to center and employ absences as data or presence that co-construct the research process to better understand and make transparent relationships that can be located in data’s absences.
In the Commentary Section, a reflective piece from postcolonial communication scholar Radha Hegde titled “Pathogens, Precarity, and Digital Politics of Exclusion” offers insights into the race- and class-tipped protocols ascribing shelter during the COVID-19 pandemic in the United States and India. The piece situates this moment of lived experience across the globe as we witness how informational and connective logics, housed in infrastructures and institutions, produce housing and health precarity. Hegde reveals these as global practices, made possible by neoliberal and capital pressures that produce a politics of systemic exclusion.
The Image and Text section offers three practice-infused pieces deploying visual and multimodal interventions that probe the sensorial dimensions of systems and platforms, exposing problematic systemic predilections and offering provocations for safer and more sustainable futures.
Artist Shu Lea Cheang presents her provocative installation 3x3x6 from the Venice Biennale (2019) that reveals, reproduces, and reinvents how gender and sexuality norms are entrenched in carceral and public spaces. The image/text piece, using visual parallax form, moves participants through elements of the whimsical installation, visiting the video-refitted prison space (once housing the famed lover Casanova), staged transpunk photos, digital panoptic interfaces, and “fucked up” avatar face tracking. Accompanied by comments from curator Paul Preciado and an interview with Paula Gardner, the experience invites us to imagine a future ungoverned by gender, sex, and other binary impositions. The cover of this issue also features a piece from 3x3x6.
Jennifer Willet discusses the ethos of her Incubator lab, at the University of Windsor, Ontario, Canada, a collaborative and community-engaged bioart lab that invites deep engagement with scientific methods and biomatter. Lab collaborators work to render biological life accessible, in keeping with the life-affirming work of feminist, postcolonial perspectives and supporting Indigenous ways of knowing. Willet frames the lab’s unique work—taking the form of parades, cultivated bacteria, and bureaucratic risk management critiques—as reflecting and cultivating love for all life communities.
Hayri Dortdivanlioglu maps the smart sensors and CCTV cameras deployed in Atlanta’s smart city project, SmartATL, placed along the North Avenue Smart Corridor in a notably Black neighborhood. The mapping makes visible the (visually obscured) sensor devices and the all-encompassing surveillance achieved by the algorithms that coordinate them. The potent visual maps are deployed to help us recognize and push back against the surveillance gaze and its uneven impacts on racialized, peripheralized subjects.
This collection of text and practice-based work reflects diverse global understandings and visionings of the platform and the system. Each article reflects a snapshot of this pandemic moment in which we live, write, and make, providing everyday life context that is crucial to understanding system constraints and reproductive problematics, and to imagining different human-technology futures. This moment has newly pressured us all to account for the gendered and raced inequities and disparities produced by white supremacy and settler colonialism that are validated and perpetuated in our systems and platforms. We hope that as a collection born of this moment and infused with a critical race and postcolonial lens, the work will deepen our feminist understandings of how the complex ideological ground of platforms normalizes and sustains systemic bias. The pieces offer potent methods of critique that divert the gaze (Coleman 2009; Browne 2015), reject racial coding, probe contemporary practices of governance and branding, and deploy these to ideate humane technology platforms in service to sustainable futures. It is our hope that in colliding communication, STS, HCI, digital humanities, and other fields, we create opportunities to recognize feminist divergences as opportunities for productive disruption. Thank you for spending time with the Special Section.
References
Abdelnour-Nocera, José, Torkil Clemmensen, and Masaaki Kurosu. 2013. “Reframing HCI through Local and Indigenous Perspectives.” International Journal of Human-Computer Interaction 29 (4): 201–4. https://doi.org/10.1080/10447318.2013.765759.
Almeida, Teresa, Madeline Balaam, Shaowen Bardzell, and Lone Koefoed Hansen. 2020. “Introduction to the Special Issue on HCI and the Body: Reimagining Women’s Health.” ACM Transactions on Human-Computer Interaction 27 (4): 1–32. https://doi.org/10.1145/3406091.
Anzaldúa, Gloria. (1987) 2012. Borderlands/La Frontera: The New Mestiza. San Francisco: Aunt Lute Books.
Barad, Karen. 2007. Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham, NC: Duke University Press.
Bardzell, Shaowen. 2010. “Feminist HCI: Taking Stock and Outlining an Agenda for Design.” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 1301–10. https://doi.org/10.1145/1753326.1753521.
———. 2018. “Utopias of Participation: Feminism, Design, and the Futures.” ACM Transactions on Computer-Human Interaction 25 (1): 1–24. https://doi.org/10.1145/3127359.
Bardzell, Shaowen, and Elizabeth F. Churchill, eds. 2011. “Feminism and HCI: New Perspectives.” Special issue, Interacting with Computers 23 (5): 385–564. https://doi.org/10.1016/S0953-5438(11)00089-0.
Bellini, Rosanna, Angelika Strohmayer, Ebtisam Alabdulqader, Alex A. Ahmed, Katta Spiel, Shaowen Bardzell, and Madeline Balaam. 2018. “Feminist HCI: Taking Stock, Moving Forward, and Engaging Community.” CHI EA ’18: Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems, 1–4. https://doi.org/10.1145/3170427.3185370.
Benjamin, Ruha. 2019. Race after Technology: Abolitionist Tools for the New Jim Code. Medford, MA: Polity Press.
———. 2020. “Viral Justice: The Pandemic, Policing and Portals.” Streamed live on July 16, 2020. YouTube. https://www.youtube.com/watch?v=9txyFSphrT8.
Browne, Simone. 2015. Dark Matters: On the Surveillance of Blackness. Durham, NC: Duke University Press.
Buolamwini, Joy. 2017. “Gender Shades: Intersectional Phenotypic and Demographic Evaluation of Face Datasets and Gender Classifiers.” Master’s thesis, Massachusetts Institute of Technology. https://www.media.mit.edu/publications/full-gender-shades-thesis-17/.
Buolamwini, Joy, and Timnit Gebru. 2018. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” Proceedings of Machine Learning Research 81 (1): 1–15. http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf.
Chivukula, Shruti Sai, and Colin M. Gray. 2020. “Bardzell’s ‘Feminist HCI’ Legacy: Analyzing Citational Patterns.” Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, 1–8. https://doi.org/10.1145/3334480.3382936.
Choi, Jaz Hee-jeong, and Ann Light. 2020. “‘The Co-’: Feminisms, Power and Research Cultures: A Dialogue.” Interactions 27 (6): 26–28. https://doi.org/10.1145/3429697.
Chun, Wendy Hui Kyong. 2009. “Introduction: Race and/as Technology: Or, How to Do Things to Race.” Camera Obscura 24 (1 (70)): 7–35. https://doi.org/10.1215/02705346-2008-013.
———. 2011. Programmed Visions: Software and Memory. Cambridge, MA: MIT Press.
Coleman, Beth. 2009. “Race as Technology.” Camera Obscura 24 (1 (70)): 177–207. https://doi.org/10.1215/02705346-2008-018.
Collins, Patricia Hill. 1986. “Learning from the Outsider Within: The Sociological Significance of Black Feminist Thought.” Social Problems 33 (6): S14–S32. https://doi.org/10.2307/800672.
Combahee River Collective. (1977) 2001. “The Combahee River Collective Statement.” In Available Means: An Anthology of Women’s Rhetoric(s), edited by Joy Ritchie and Kate Ronald, 292–300. Pittsburgh: University of Pittsburgh Press.
Daniels, Jessie. 2009. “Rethinking Cyberfeminism(s): Race, Gender, and Embodiment.” Women’s Studies Quarterly 37 (1–2): 101–24. http://doi.org/10.1353/wsq.0.0158.
D’Ignazio, Catherine, and Lauren F. Klein. 2020. Data Feminism. Cambridge, MA: MIT Press.
Dolezal, Luna, and Venla Oikkonen. 2021. “Self-Tracking, Embodied Differences, and Intersectionality.” Catalyst: Feminism, Theory, Technoscience 7 (1): 1–15. https://doi.org/10.28968/cftt.v7i1.35273.
Firestone, Shulamith. 1970. The Dialectic of Sex: The Case for Feminist Revolution. New York: William Morrow and Company, Inc.
Forlano, Laura. 2017. “Data Rituals in Intimate Infrastructures: Crip Time and the Disabled Cyborg Body as an Epistemic Site of Feminist Science.” Catalyst: Feminism, Theory, Technoscience 3 (2): 1–28. https://doi.org/10.28968/cftt.v3i2.28843.
Gardner, Paula, Stephen Surlin, Adekunle Akinyemi, Jessica Rauchberg, Caitlin McArthur, Yujiao Hao, Rong Zheng, and Alexandra Papaioannou. 2021. “Designing a Dementia-Informed, Accessible, Co-Located Gaming Platform for Diverse Older Adults with Dementia, Family and Carers.” In HCII 2021, LNCS 12787, edited by Q. Gao and J. Zhou. Cham: Springer. https://doi.org/10.1007/978-3-030-78111-8_4.
Grosz, Elizabeth. 2005. Time Travels: Feminism, Nature, Power. Durham, NC: Duke University Press.
Guzman, Andrea, ed. 2018. Human-Machine Communication: Rethinking Communication, Technology, and Ourselves. New York: Peter Lang.
Haraway, Donna. 1988. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” Feminist Studies 14 (3): 575–99. https://doi.org/10.2307/3178066.
———. 1990. Simians, Cyborgs, and Women: The Reinvention of Nature. New York: Routledge.
Hayles, N. Katherine, ed. 1991. Chaos and Order: Complex Dynamics in Literature and Science. Chicago: University of Chicago Press.
———. 1999. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press.
hooks, bell. (1984) 2015. Feminist Theory: From Margin to Center. New York: Routledge.
Kember, Sarah. 2013. “Gender Estimation in Face Recognition Technology: How Smart Algorithms Learn to Discriminate.” Media Fields Journal 7: 1–10. http://mediafieldsjournal.squarespace.com/gender-estimation-in-face-reco/.
Kember, Sarah, and Joanna Zylinska. 2012. Life after New Media: Mediation as a Vital Process. Cambridge, MA: MIT Press.
Kneese, Tamara, and Beza Merid. 2018. “Introduction: Illness Narratives, Networked Subjects and Intimate Publics.” Catalyst: Feminism, Theory, Technoscience 4 (1): 1–6. https://doi.org/10.28968/cftt.v4i1.29627.
Kumar, Sangeet, and Radhika Parameswaran. 2018. “Charting an Itinerary for Postcolonial Communication and Media Studies.” Journal of Communication 68 (2): 347–58. https://doi.org/10.1093/joc/jqx025.
Light, Ann. 2011. “HCI as Heterodoxy: Technologies of Identity and the Queering of Interaction with Computers.” Interacting with Computers 23 (5): 430–38. https://doi.org/10.1016/j.intcom.2011.02.002.
Lorde, Audre. (1984) 2018. The Master's Tools Will Never Dismantle the Master’s House. New York: Penguin Modern.
Munster, Anna. 2006. Materializing New Media: Embodiment in Information Aesthetics. Hanover, NH: Dartmouth College Press.
Nakamura, Lisa. 2012. “Queer Female of Color: The Highest Difficulty Setting There Is? Gaming Rhetoric as Gender Capital.” Ada: A Journal of Gender, New Media, and Technology, no. 1.
Noble, Safiya U. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: NYU Press.
Parvin, Nassim. 2019. “Look up and Smile!: Seeing through Alexa’s Algorithmic Gaze.” Catalyst: Feminism, Theory, Technoscience 5 (1): 1–11. https://doi.org/10.28968/cftt.v5i1.29592.
Sandoval, Chela. 2000. Methodology of the Oppressed. Minneapolis: University of Minnesota Press.
Stark, Luke, and Kate Crawford. 2019. “The Work of Art in the Age of Artificial Intelligence: What Artists Can Teach Us about the Ethics of Data Practice.” Surveillance & Society 17 (3–4): 442–55. https://doi.org/10.24908/ss.v17i3/4.10821.
Suchman, Lucy. 2007. Human-Machine Reconfigurations: Plans and Situated Actions. Cambridge: Cambridge University Press.
TallBear, Kim. 2013. Native American DNA: Tribal Belonging and the False Promise of Genetic Science. Minneapolis: University of Minnesota Press.
Author Bios
Paula Gardner is Associate Professor and Asper Chair at McMaster University, Ontario. She runs Pulse Lab at McMaster, which engages collaborative practice to innovate art and health technologies with community for social change.
Sarah Kember is Professor of New Technologies of Communication at Goldsmiths, University of London and Director of Goldsmiths Press.