Algorithmic Glass Ceilings and Gendered Echo Chambers: “Bias Amplification” in Social Networks

A pair of recent papers highlights how today’s social networks not only reflect societal biases, but can actually amplify them.

Ana-Andreea Stoica et al.’s Algorithmic glass ceiling in social networks: the effects of social recommendations on network diversity looks at the effect of “social recommendations” such as friend suggestions and people to follow, both at the theoretical level and empirically on Instagram. The authors find that “prominent social recommendation algorithms can exacerbate the under-representation of certain demographic groups at the top of the social hierarchy.” More specifically:

Our mathematical analysis demonstrates the existence of an algorithmic glass ceiling that exhibits all the properties of the metaphorical social barrier that hinders groups like women or people of colour from attaining equal representation.

One would a priori expect similarity metrics, usually the basis of recommender systems, to contribute to sustaining disparities among various groups. We show much more: using empirical evidence from newly collected data on Instagram and a rigorous analysis of mathematical models, we prove that prominent recommender algorithms reinforce the rate at which disparity grows.

The first couple of sections of the paper are a quick read, after which it gets into some heavy-duty math. Fortunately, Kim Martineau’s How Social Networking Sites May Discriminate Against Women, on Columbia News, is a good summary; and Adrian Colyer, on the ACM’s The Morning Paper, walks through the paper in detail.

The underlying dynamic here, homophily (people’s tendency to prefer interacting with people similar to themselves), isn’t new. Neither is the idea of a “glass ceiling” in social media,* or the realization that algorithmic recommendations reflect societal biases.** What’s important about this paper is both the formal model and the experimental results showing bias amplification.
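The way homophily plus “rich get richer” attachment produces a glass ceiling is easy to see in a toy simulation. Here’s a minimal sketch, using a simplified biased preferential-attachment model of my own devising — the `simulate` function and all its parameters are illustrative, not the paper’s actual model:

```python
import random

def simulate(n=2000, minority_frac=0.3, homophily=0.7, seed=42):
    """Toy biased preferential-attachment model (illustrative only).

    Nodes arrive one at a time, take the minority color with probability
    minority_frac, and attach to an existing node chosen proportionally
    to its degree; a cross-color edge is accepted only with probability
    (1 - homophily), otherwise we resample.
    """
    rng = random.Random(seed)
    colors = ['min' if rng.random() < minority_frac else 'maj' for _ in range(2)]
    degree = [1, 1]        # start with a single edge between nodes 0 and 1
    endpoints = [0, 1]     # each edge lists both endpoints, so sampling from
                           # this list is degree-weighted (preferential attachment)

    for _ in range(n - 2):
        color = 'min' if rng.random() < minority_frac else 'maj'
        while True:
            target = rng.choice(endpoints)
            if colors[target] == color or rng.random() < 1 - homophily:
                break      # same-color edges always accepted; cross-color rarely
        new_id = len(colors)
        colors.append(color)
        degree.append(1)
        degree[target] += 1
        endpoints.extend([new_id, target])

    # Minority share overall vs. among the top 5% of nodes by degree
    ranked = sorted(range(len(colors)), key=lambda i: -degree[i])
    top = ranked[:len(ranked) // 20]
    overall = sum(c == 'min' for c in colors) / len(colors)
    at_top = sum(colors[i] == 'min' for i in top) / len(top)
    return overall, at_top
```

Run it and the minority’s share among the highest-degree nodes comes out below its share of the population — a glass ceiling in miniature. A recommender that suggests high-degree, similar-to-you accounts then feeds on exactly these skewed signals, which is the amplification the paper proves formally.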

Meanwhile, Nikki Usher et al.’s Twitter Makes It Worse: Political Journalists, Gendered Echo Chambers, and the Amplification of Gender Bias looks at “beltway journalists’ peer-to-peer relationships on Twitter—or how journalists use the platform to legitimate, amplify, and engage each other,” and similarly finds substantial evidence of gender bias. In particular:

Most alarming is that male journalists amplify and engage male peers almost exclusively, while female journalists tend to engage most with each other.  The significant support for claims of gender asymmetry as well as evidence of gender silos are findings that not only underscore the importance of further research but also suggest overarching consequences for the structure of contemporary political communication.

Hey wait a second, I’m noticing a pattern here!

 


* see for example Susan Herring et al.’s classic 2003 paper Women and children last: the discursive construction of Weblogs and Shirin Nilizadeh et al.’s 2016 Twitter’s Glass Ceiling: The Effect of Perceived Gender on Online Visibility.

** recent books like Dr. Safiya Umoja Noble’s Algorithms of Oppression: How Search Engines Reinforce Racism and Virginia Eubanks’ Automating Inequality have plenty of examples; here’s a 2011 post from me focusing on TechMeme’s recommendation algorithms.

 

Sex, pleasure, and diversity-friendly software: the article the ACM wouldn’t publish

Sex, pleasure, and diversity-friendly software was originally written as an invited contribution to the Human to Human issue of XRDS: Crossroads, the Association for Computing Machinery’s student magazine. After a series of presentations on diversity-friendly software, it seemed like an exciting opportunity to broaden awareness among budding computer scientists of important topics that are generally overlooked both in university courses and in the industry.

Alas, things didn’t work out that way.

Overriding the objections of the student editors, and despite agreeing that the quality of the work was high and the ideas were interesting, the ACM refused to publish the article. The ACM employees involved were all professional and respectful, and agreed on the importance of diversity.  Still, due to concerns about discussions of sex and sexuality offending ACM subscribers and members, they would not even consider publishing a revised version.

The CHI paper What’s at Issue: Sex, Stigma, and Politics in ACM Publishing (authored by Alex Ahmed, Judeth Oden Choi, Teresa Almeida, Kelly Ireland, and me) explores some of the underlying institutional and sociopolitical problems highlighted by this episode and others that arose in editing the Human to Human issue, and proposes starting points for future action in HCI-related research and academic publishing practices.

This revised version of Sex, pleasure, and diversity-friendly software is written as a companion piece to What’s at Issue. After a brief background section, it includes extended (and lightly edited) excerpts from the earlier version of the article, along with my reflections on the experience and the opportunities it highlights for software engineering. An appendix includes a brief overview of diversity-friendly software along with links to more detailed discussions.


Gender HCI, Feminist HCI, and Post-Colonial Computing

Emma Willard’s Temple of Time (1846)

Last major update, October 2018*

For years, I’ve asked software engineers and designers I run into at conferences if they know about gender HCI (human-computer interaction), feminist HCI, or post-colonial computing. More recently, I’ve added intersectional HCI, anti-oppressive design, and design justice to the list as well. The response is usually something along the lines of “sounds interesting, but never heard of it.”

Which is a shame. These fields have some great insights about how to create software that works better for everybody. A very brief overview:

  • Gender HCI focuses on the differences in how different genders interact with computers.
  • Feminist HCI is concerned with the design and evaluation of interactive systems that are imbued with sensitivity to the central commitments of feminism: agency, fulfillment, identity and the self, equity, empowerment, diversity, and social justice.
  • Post-colonial Computing centers on the questions of power, authority, legitimacy, participation, and intelligibility in the contexts of cultural encounter, particularly in the context of contemporary globalization.
  • Intersectional HCI is a framework for engaging with the complexity of users’ and authors’ identities, and situating these identities in relation to their contextual surroundings.
  • Anti-oppressive design “considers both the values embedded in technological design and the environment that surrounds how a technology is built and researched.”
  • Design justice focuses on the ways that design reproduces, is reproduced by, and/or challenges the matrix of domination (white supremacy, heteropatriarchy, capitalism, and settler colonialism); it is also a growing social movement.

At some level it’s not surprising that this work isn’t as well known as it should be. Much of it has been done by women of color, queer and trans people, and others who are marginalized within the tech world. Much of it is heavily influenced by the social sciences, which are also marginalized by tech. And much of it is subject to the kinds of biases against anti-oppressive research that Alex Ahmed, Judeth Oden Choi, Teresa Almeida, Kelly Ireland, and I discuss in What’s at Issue: Sex, Stigma, and Politics in ACM Publishing.**

So here are some slightly longer overviews of these different areas, each featuring a handful of key papers, along with a few videos. Like any literature survey, what’s here is filtered through my background and interests; if there’s other work that you think should be here, please let me know!
