Sex, pleasure, and diversity-friendly software: the article the ACM wouldn’t publish

Sex, pleasure, and diversity-friendly software was originally written as an invited contribution to the Human to Human issue of XRDS: Crossroads, the Association for Computing Machinery’s student magazine.  After a series of presentations on diversity-friendly software, it seemed like an exciting opportunity to broaden awareness among budding computer scientists of important topics that are generally overlooked both in university courses and in industry.

Alas, things didn’t work out that way.

Overriding the objections of the student editors, and despite agreeing that the quality of the work was high and the ideas were interesting, the ACM refused to publish the article. The ACM employees involved were all professional and respectful, and agreed on the importance of diversity.  Still, due to concerns about discussions of sex and sexuality offending ACM subscribers and members, they would not even consider publishing a revised version.

The CHI paper What’s at Issue: Sex, Stigma, and Politics in ACM Publishing (authored by Alex Ahmed, Judeth Oden Choi, Teresa Almeida, Kelly Ireland, and me) explores some of the underlying institutional and sociopolitical problems highlighted by this episode and by others that came up while editing the Human to Human issue, and proposes starting points for future action in HCI-related research and academic publishing practices.

This revised version of Sex, pleasure, and diversity-friendly software is written as a companion piece to What’s at Issue. After a brief background section, it includes extended (and lightly edited) excerpts from the earlier version of the article, along with my reflections on the experience and the opportunities it highlights for software engineering. An appendix includes a brief overview of diversity-friendly software along with links to more detailed discussions.

Background

The first version’s “background” section was intended as a quick overview for students, with references for those who wanted to find out more.  As four-paragraph intros go, it was decent enough, with some useful links including Lynn Shore et al.’s Journal of Management paper “Inclusion and diversity in work groups: A review and model for future research“, Candice Morgan’s Harvard Business Review article What we learned from improving diversity rates at Pinterest (a good overview of learnings from a program that incorporates many current best practices), and a resource page with definitions and videos on intersectionality.

Still, there are much richer discussions of diversity in technology out there.   For example:

“Diversity” itself is an interesting subject. Dafina-Lazarus Stewart’s observation in Language of Appeasement that “by substituting diversity and inclusion rhetoric for transformative efforts to promote equity and justice, colleges have avoided recognizable institutional change” applies to the tech world and corporate America as well. That said, many people working on diversity and inclusion in technology are also fiercely committed to equity and justice.  It would be really great to have a term for “intersectional diversity, inclusion, equity, and justice with an explicit anti-racist, anti-sexist, trans-inclusive, LGBTQIA+ forward, accessible, anti-colonial stance”; Jill Dimond’s Anti-Oppressive Design is one promising direction.

Sara Ahmed’s work is valuable for anybody working on diversity-related topics.  On Being Included: Racism and Diversity in Institutional Life is based on qualitative research, including semi-structured interviews and “ethnographic material derived from my own experience from working in what we could call simply ‘the diversity world.’”  Her blog Feminist Killjoys and videos like Complaint: Diversity Work, Feminism, and Institutions are excellent introductions.

The Association for Computing Machinery (ACM) is the world’s largest professional computing society, bringing together “computing educators, researchers, and professionals to inspire dialogue, share resources, and address the field’s challenges.”  What’s at Issue provides context on recent issues in ACM publishing, including student editor reactions to running an ad from the NSA in 2014, Communications of the ACM’s November 2016 cover and article “Sex as an Algorithm”, and Marie desJardins’ critique.

Excerpts from the earlier version

Introduction

Most software today reinforces existing societal power dynamics and works best for people who are similar to the teams who created it. Given the demographics and biases of the software industry, today’s dynamics tend to leave women, gender-diverse people, people of color, disabled people, and many others out of the equation.

People and communities create software, which in turn empowers people and communities

Diversity-friendly software, by contrast, is intentionally designed for a diverse user base.  Many techniques for diversity-friendly software, such as accessibility, gender HCI (human-computer interaction), and flexible and optional self-identification, are backed by solid research and practical experience. For the most part, though, these techniques are not yet broadly practiced in the industry.
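To make “flexible and optional self-identification” concrete, here is a minimal TypeScript sketch of what it might look like in a user profile. The type and field names are hypothetical rather than drawn from any system discussed here; the point is that pronouns and gender are free-form, optional, and visibility-controlled rather than forced into a mandatory binary dropdown.

```typescript
// Hypothetical profile shape: self-identification is free-form, optional,
// and visibility-controlled, never a required binary choice.
interface SelfIdField {
  value: string;                                     // free text, e.g. "they/them" or "ze/hir"
  visibleTo: "everyone" | "connections" | "only-me"; // user controls who sees it
}

interface UserProfile {
  displayName: string;          // user-chosen; need not match a legal name
  pronouns?: SelfIdField;       // absent simply means "not specified"
  genderIdentity?: SelfIdField; // free text, not a fixed list of options
}

// Rendering treats "not specified" as a normal, first-class state.
function renderPronouns(profile: UserProfile, viewerIsConnection: boolean): string {
  const p = profile.pronouns;
  if (!p || p.visibleTo === "only-me") return "";             // never guess a default
  if (p.visibleTo === "connections" && !viewerIsConnection) return "";
  return p.value;
}
```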

Software relating to sex and pleasure education is an interesting microcosm of the broader industry.  Sex has long been a major driver for online innovation; streaming video, for example, was first introduced by a Dutch porn company in 1994.   Today, sexual wellness is a huge market, and porn companies are looking to move into the area.  Peggy Orenstein critiques one such offering as “almost like a white-washing scam to justify the kind of anti-female pleasure, misogynist, distorted sexuality that often eroticizes humiliation, that’s devoid of intimacy, and at best mis-represents female pleasure”.  From a software perspective, porn platforms focus on commodification and the needs of cis males.

Sites like Scarleteen, OMGYes, O.school, and Make Love Not Porn, by contrast, are led by women, trans, and non-binary people, and take a much broader and more inclusive view of their audience. Scarleteen was created and built based on what young people ask for, and its philosophy starts with “A foundation of equality, respect, dignity, fairness, consent, liberty, freedom of thought and expression and other core human rights.”  OMGYes provides “knowledge for women and partners” and takes a science-based approach.  O.school’s initial alpha test covered topics including “Negotiating Consent While Living with a Mental Illness”, “Healing from Religious Shame”, “Why Pleasure Matters”, and “Sexy Safe Sex”.  Make Love Not Porn’s Cindy Gallop says “Everything we do is purely to make it easier to talk openly and honestly about sex in a public domain.”

Software to support intimate spaces to discuss these intensely personal topics requires some very different priorities.

Software embeds biases

Since software is designed, written, and tested by people, it’s scarcely surprising that the industry’s current and historical diversity challenges have embedded themselves in the software itself.  Only rarely do developers intentionally insert biases into the software.  Instead, it usually happens unconsciously.

One excellent example of this pattern is web accessibility: making web sites and applications usable for people with a diverse range of hearing, movement, sight, and cognitive abilities. The original HTML specifications didn’t take accessibility into account, and even though the standards have evolved, accessibility is still treated as an afterthought. It requires additional expertise and effort to create web pages that support screen readers or mouseless navigation. These skills are not generally taught in undergraduate courses or coding schools, and many companies do not invest the resources to make their software accessible.
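To give a feel for that extra effort, here’s a small TypeScript sketch, using only standard DOM APIs, of retrofitting screen reader and keyboard support onto a custom clickable <div>. The element ID and label are hypothetical.

```typescript
// Retrofitting accessibility onto a custom clickable <div>.
// (A native <button> provides all of this by default, which is why
// accessibility is cheapest when designed in rather than bolted on.)
const widget = document.getElementById("save-widget"); // hypothetical element
if (widget) {
  widget.setAttribute("role", "button");           // announce it as a button
  widget.setAttribute("aria-label", "Save draft"); // accessible name for screen readers
  widget.tabIndex = 0;                             // reachable without a mouse
  widget.addEventListener("keydown", (event) => {
    // Activate on Enter or Space, mirroring native button behavior.
    if (event.key === "Enter" || event.key === " ") {
      event.preventDefault();
      widget.click();
    }
  });
}
```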

Algorithms are another important source of bias.  MIT grad student Joy Buolamwini, founder of the Algorithmic Justice League, explains why facial recognition software tends to have a harder time recognizing black faces:

“Computer vision uses machine-learning techniques to do facial recognition. You create a training set with examples of faces. However, if the training sets aren’t really that diverse, any face that deviates too much from the established norm will be harder to detect.”
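One low-cost engineering safeguard follows directly from this: evaluate accuracy per demographic group, not just in aggregate, so a non-diverse training set shows up as a number before shipping rather than as a failure in production. A minimal TypeScript sketch, with hypothetical data shapes:

```typescript
// Disaggregated evaluation: report accuracy per group instead of one
// overall number, so disparities are visible before shipping.
interface LabeledResult {
  group: string;    // hypothetical self-reported or annotated group label
  correct: boolean; // did the model classify this example correctly?
}

function accuracyByGroup(results: LabeledResult[]): Map<string, number> {
  const counts = new Map<string, { right: number; total: number }>();
  for (const r of results) {
    const c = counts.get(r.group) ?? { right: 0, total: 0 };
    c.right += r.correct ? 1 : 0;
    c.total += 1;
    counts.set(r.group, c);
  }
  const accuracy = new Map<string, number>();
  for (const [group, c] of counts) {
    accuracy.set(group, c.right / c.total);
  }
  return accuracy;
}

// A 95% overall accuracy can hide a group where the model is right only
// 65% of the time; per-group reporting surfaces that gap immediately.
```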

Another high-profile case of algorithmic bias was reported by Julia Angwin et al. in their Pulitzer Prize-nominated Machine Bias series on ProPublica:  “There’s software used across the country to predict future criminals. And it’s biased against blacks.”

Social networks provide other examples of bias.  Harassment and threats of violence primarily target women and gender-diverse people – especially women and gender-diverse people of color.  Twitter essentially ignored this issue for years, and their more recent attempts to do something about it have been remarkably unsuccessful.  Facebook’s moderation disproportionately penalizes activists of color.

Improving the diversity of the teams creating the software, and creating a more inclusive and equitable culture and environment, is one approach to reducing biases in software. More diverse teams will naturally tend to consider more dimensions of diversity.  If a team developing facial recognition software has Black engineers, they’re likely to notice the absence of black faces in their data set – or test the software on pictures of themselves and discover that it doesn’t work. Similarly, if a team developing social network software includes women of color activists who have been targeted by harassers, they’re likely to pay more attention up front to moderation features and other defenses against harassment, and have a better understanding of the problems they’re trying to solve.

While improving diversity, inclusion, and equity in the software industry is an important priority, by itself it is not enough. As mentioned above, the industry’s progress on this front has been extremely slow.  Not only that, there are so many different dimensions to diversity that any team will have gaps; and most software today is built largely from existing components, which embed biases of their own.

A complementary approach is to apply design and software engineering techniques that focus on diversity.

Conclusion

More examples of startups building diversity-friendly applications include:

  • Thurst, the first dating app for queer people of all genders, prioritizes safety and community accountability above normative dating culture.
  • Atipica’s talent and diversity intelligence solutions take a personalized and empathetic approach, using data to guide teams through traditionally difficult conversations around diversity and inclusion.
  • Blendoor’s merit-based matching is technology for hiring that reduces unconscious bias.
  • Nametag, a platform for building relationships, takes inspiration from offline organizing tactics that work for building relationships and building trust.
  • Textio’s augmented writing platform uses analytics to help teams find more qualified – and more diverse – candidates.

Not so coincidentally, these companies are led by women of color, asexual activists, trans and non-binary people, and others who have traditionally been underrepresented in the software industry.

Looking further to the future, imagine a new software stack designed by diverse teams working with diverse communities, with an explicit goal of countering different dimensions of systemic oppression.  What would an intersectional feminist programming language look like?  A queer programming language?  How will software platforms, tools, protocols, and libraries evolve?

As software “eats the world” – and increasingly defines the power vectors and distribution of wealth in our society – it’s more important than ever that we consciously design and implement it in a way that empowers everybody.

Diverse, Inclusive People and Communities create software that embeds diversity and in turn empowers the diverse people and communities who created it

Reflections

Sex, pleasure, and diversity-friendly software very intentionally focuses on amplifying the voices – and highlighting the perspectives and contributions – of an intersectionally diverse group of people in a technical context.  This is a relatively rare and very important complement to the representational, cultural, and experiential discussions of diversity usually found in ACM publications.  It’s also a small step toward countering acknowledged barriers to diversity in tech, like a lack of awareness of successful role models for students from under-represented backgrounds, and stereotypes that associate software skill solely with white and Asian men.

So the ACM’s decision not to publish this article was a classic example of a point made in the article’s first paragraph:

Given the demographics and biases of the software industry, today’s dynamics tend to leave women, gender-diverse people, people of color, disabled people, and many others out of the equation.

How meta!

To be clear, the ACM didn’t make the decision not to publish with an explicit anti-diversity goal. The decision was driven by concerns about discussions of sex and sexuality offending ACM subscribers, members, and funders.  As What’s at Issue points out, “challenging oppressive systems will be naturally considered offensive under them.”

Similar patterns occur elsewhere in academic publishing.  Colleen Flaherty’s “IEEE in trouble once again for allegedly minimizing work of female historians” quotes Sarah T. Roberts (who has just been awarded a Carnegie Fellowship for her research on information work and information workers):

Roberts said that women and other historically marginalized groups whose “pathbreaking scholarship and very identities challenge the status quo find themselves frequently in the multiple binds of having their contributions minimized — unless and until that work attains a certain popular acceptance.”

What’s at Issue discusses some of the ramifications these patterns have on academic research.

The effects also impact the tech startup ecosystem. Being mentioned in an article in a publication from a well-respected organization like ACM (or IEEE) is an important source of credibility and legitimacy. Skilled founders like the ones whose companies are mentioned here can use that credibility as an asset in getting media coverage and even funding; technical employees at those companies can casually bring it up in future job interviews. “We were mentioned in an article that didn’t get published but turned into a really interesting case study about systemic oppression” isn’t as useful.

“Technologies reflect the biases of the makers and implicit rules of society” - Malkia Cyril at Personal Democracy Forum

Software as it is today – and the effect it has on the rest of the world – has largely been shaped by these kinds of patterns.  A huge amount of effort (and zillions of dollars) goes to improving ad targeting, unethical tracking, and blockchain; almost none to countering harassment or providing accessibility.  Sites like Stack Overflow reward arrogance and shaming, and exclude women and people of color.  Y Combinator’s founder Paul Graham has a history of sexist statements; Y Combinator’s Hacker News discussion site is known for misogyny, nativism, and suppressing discussions of diversity.  Only 0.2% of venture-funded founders are Black women.  Facebook’s “move fast and break things” philosophy leads to massive privacy breaches and stolen elections, while their failure to curb hate speech against Muslims contributes to violence in Sri Lanka and Myanmar.  The list goes on …

The good news, though, is that there are more and more people who understand this and are working to counter it.  The alt.chi track at the CHI conference provides a venue for work like What’s at Issue to critique accepted practices and challenge power and privilege.  Project Include provides detailed resources for companies that are trying to be more inclusive.  In response to the RSA security conference’s initial all-male speaker list, the community quickly organized the extremely diverse OurSA conference.  Recent books like Safiya Umoja Noble’s Algorithms of Oppression, Virginia Eubanks’ Automating Inequality, and Sara Wachter-Boettcher’s Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech – and the publicity associated with them (like Google Has a Striking History of Bias Against Black Girls in Time Magazine) – contribute to even broader awareness.

One way to look at this whole episode is as a minor skirmish in a scientific revolution.  As an  intersectionally diverse group of rebels fights to transform and decolonize the white- and male-dominated, cis-normative, anti-disability, Western-centric power structures, institutions tend to push back.

Still, I’m happy to be on the side of the rebels.  As I said eighteen months ago in It’s Time to Double Down on Diversity and Inclusion:

The world’s ready for a new approach to software, one that embraces differences and sees diversity as a strength…. The diversity-in-technology community can be a key part of a multiracial, intersectionally diverse, international, transpartisan alliance of people who want to work together to change things — in tech, and more broadly.

Our goal is to build the kind of world we want to live in. For me — and hopefully for you as well — diversity and inclusion are at the center of the future we’re creating.

Diverse, Inclusive People and Communities create software that embeds diversity and in turn empowers the diverse people and communities who created it

Image credits: photo of Malkia Cyril at PDF originally tweeted by @anxiaostudio (an xiao mina); other images from Supporting Diversity with a New Approach To Software, with Tammarrian Rogers.

Appendix: Techniques for diversity-friendly software

Historical and current biases in software engineering and computer science are now embedded in the tools and processes we use to build software – and the building blocks that we use to create complex systems. How can we do better?

The earlier version of Sex, pleasure, and diversity-friendly software included a discussion of how several diversity-friendly software techniques – a robust code of conduct, pseudonymity and support for self-determination of gender pronouns, effective chat moderation, and threat modeling for harassment – applied to sex and pleasure education software.  As with the background section, it made sense in the context of the article.  As What’s at Issue notes,

These techniques are not generally discussed in undergraduate computer science curricula, nor are they practiced widely in industry…. We were looking forward to it being read widely, not because it is necessarily the “correct” way of designing this system (as one of our reviewers aptly pointed out, erasing names through pseudonymity has particular consequences for indigenous ways of knowing). Rather, we believe that introducing this system to the academic community would open the possibility for its adoption, rejection, debate, or modification.

Oh well, maybe next time.  Fortunately, other resources cover similar ground in more detail (albeit without the visibility of XRDS: Crossroads or the imprimatur of appearing in an academic publication).  For example:

There’s a substantial amount of related work out there. Algorithmic bias in particular has gotten a lot of attention these days; Emily Drabinski’s Ideologies of Boring Things: The Internet and Infrastructures of Race (a review of Safiya Umoja Noble’s book Algorithms of Oppression), Cordelia Fine’s Coded prejudice: how algorithms fuel injustice (a review of Algorithms of Oppression and Virginia Eubanks’ Automating Inequality), and Ian Tucker’s ‘A white mask worked better’: why algorithms are not colour blind (an interview with Joy Buolamwini) are good starting points.  On pseudonymity, J. Nathan Matias’ The Real Name Fallacy on The Coral Project and the Geek Feminism Wiki’s Who is Harmed by a Real Names Policy should be mandatory reading for anybody designing software that relates to identity.
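As a small illustration of the design point those pieces argue for, here is a minimal TypeScript sketch (all names are hypothetical): pseudonymous display names are decoupled from a stable internal account ID, and safety features key off that internal ID, so moderation never depends on collecting legal names.

```typescript
// Pseudonymity sketch: what people see is a chosen handle; what the
// system enforces on is a stable internal ID that is never displayed.
interface Account {
  id: string;          // stable internal identifier, never shown publicly
  displayName: string; // user-chosen pseudonym, changeable at any time
  email?: string;      // private recovery/contact info, never displayed
}

// Only the pseudonym ever leaves the server.
function publicView(account: Account): { displayName: string } {
  return { displayName: account.displayName };
}

// Blocks are keyed on internal IDs, so changing a pseudonym doesn't
// let a harasser evade an existing block.
function isBlocked(viewerId: string, authorId: string, blocks: Set<string>): boolean {
  return blocks.has(`${viewerId}:${authorId}`);
}
```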

A few other links especially worth highlighting: