“Though designers and developers may set out with the best intentions, implicit biases and a lack of foresight generate problems—from racist bots to transphobic dating apps.” Dr. Anna Lauren Hoffmann
By Dr. Anna Lauren Hoffmann, UC Berkeley School of Information
The consensus in the startup world is that diversity and equality are mission critical to an individual company’s success and to the future of tech as a whole. Still, tech companies continue to fail to recognize and support diverse bodies and identities.
The problem of diversity in tech is partly one of diversifying the workforce—not just by hiring more women and people from populations underrepresented in tech, but supporting them and making sure they are represented in leadership positions.
But while hiring and representation are no doubt important, it is equally important that we recognize the many other ways diversity is relevant to the design and development of products, platforms, and services, and that we recognize and support diverse users. Though designers and developers may set out with the best intentions, implicit biases and a lack of foresight generate problems—from racist bots to transphobic dating apps.
Take, for example, recent controversies surrounding Amazon Prime’s same-day delivery service. Though the company prides itself on a certain kind of equality—they readily note that demographic data like race and ethnicity do not figure into how they distribute their goods and services—the roll-out of their same-day service has, however inadvertently, doubled down on established racial and ethnic divides in certain areas. Until very recently, for example, same-day delivery in Boston excluded the predominantly Black neighborhood of Roxbury while simultaneously serving neighborhoods on all sides of it.
From the perspective of any individual company, cases like the one Amazon faced appear anomalous—they’re characterized as “edge cases” or exceptions. But they’re only anomalies from a certain perspective—usually one that treats white, male, cisgender, and able-bodied users as “normal.” From the perspective of oppressed or minority groups, however, having to struggle against systems and platforms that do not account for their particular identities and needs is no anomaly—it has long been the norm.
Put another way: the case may be anomalous from Amazon’s point of view, but it’s far from anomalous for residents of underserved neighborhoods across the country. For Amazon, their same-day delivery service wasn’t intentionally racist. But for residents of predominantly Black neighborhoods that are consistently and systematically underserved by both private companies and public institutions, this case is part and parcel of a system of racial oppression.
Importantly, it’s not just automated or algorithmically-driven systems that cause these problems—things like “real name” policies or a hands-off attitude towards harassment work to further disempower already vulnerable users.
These points aren’t always obvious to people who fit into dominant groups, since—more often than not—dominant identities and needs are accounted for in the design and development of particular platforms or systems. Those of us who don’t fit easily into the dominant mode often have to struggle to gain access or make certain tools work for us. But just because it isn’t obvious doesn’t mean it’s not vital. As danah boyd recently put it:
“I don’t care what your politics are. If you’re building a data-driven system and you’re not actively seeking to combat prejudice, you’re building a discriminatory system.”
Speaking as such a person, and also as someone who works with and educates folks going out into the tech world, here are five steps I believe startup and tech leaders can take early on to start better recognizing problems of diversity more broadly. They aren’t going to fix all problems, but they’re a start.
- Recognize your own subjectivity – and that of those around you. If your team members share similar identities, backgrounds, and interests, then you have to work harder to overcome the limitations in your perspective. After all, simply ignoring the limits of your perspective doesn’t make those limits go away.
- Allow for ambiguity. Human lives are fluid and messy, so make room for that wherever possible. Exploring and harnessing the potential of existing data takes skill and creativity–these same creative efforts should be applied to the processes of classification and the design of categories that inform the scope and shape of data in the first place. In particular, researchers should be critical of binary oppositions in platform design or coding schemes.
- Give it a human touch. Build in meaningful opportunities for human oversight and—where possible—make those opportunities transparent to users. Take a cue from these Twitter bot makers that talk about deliberately restraining the frequency of Tweets their bots produce. Though their bots could technically tweet at staggeringly rapid rates, limiting the number of Tweets produced allows for meaningful oversight and intervention on the part of makers and owners.
- Respect people’s time. If you consult outsiders to help you overcome blind spots (and you should!), don’t expect them to offer their input or advice for free. Budget for this from the start and compensate people appropriately for their time. Similarly, if you rely on a team member from a minority or underserved background, recognize their efforts appropriately—being the representative for all things “diversity” in an otherwise homogenous environment is mentally and emotionally taxing work.
- Exercise humility. Time and time again, I hear people say they want to “get things right” with regard to diverse communities of users. Disabuse yourself of this notion. Begin by recognizing that there is rarely a clear “right” thing to do; it’s not about perfection, it’s about making some progress and continually trying to do better. If someone raises an issue, listen carefully and with a willingness to adjust and change. After all, listening without being willing to learn and change isn’t really listening at all.
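The “allow for ambiguity” point above can be made concrete in data modeling. Rather than forcing an identity attribute into a closed binary, a schema can accept optional, self-described values. The following is a minimal sketch, with hypothetical field and function names not drawn from the article:

```python
from dataclasses import dataclass
from typing import Optional

# Instead of a closed binary (e.g., a boolean or a two-value enum for gender),
# store identity attributes as optional, self-described free text.
@dataclass
class UserProfile:
    username: str
    # Open-ended: users describe themselves in their own words, or not at all.
    gender: Optional[str] = None
    pronouns: Optional[str] = None

def display_pronouns(profile: UserProfile) -> str:
    # Fall back gracefully when the user chose not to specify.
    return profile.pronouns if profile.pronouns else "they/them"

# Usage: a profile that a strictly binary schema could not represent.
alex = UserProfile(username="alex", gender="non-binary", pronouns="ze/zir")
print(display_pronouns(alex))
print(display_pronouns(UserProfile(username="sam")))
```

The design choice here is that absence and ambiguity are first-class states, not errors: the system never forces a user into a category that doesn’t fit, and downstream code must handle unspecified values explicitly.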
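The “human touch” point above, illustrated by bot makers who deliberately restrain tweet frequency, can also be sketched in code. This is a hypothetical example, not a real bot framework: generated posts wait in a review queue, and publication is throttled so a human can realistically keep up and intervene:

```python
from collections import deque
from typing import Optional

class ThrottledBot:
    """Sketch of a bot whose output is rate-limited to allow human oversight."""

    def __init__(self, min_interval_seconds: float = 3600.0):
        self.min_interval = min_interval_seconds  # e.g., at most one post per hour
        self.pending: deque = deque()             # generated, awaiting approval
        self.last_posted_at: Optional[float] = None

    def generate(self, text: str) -> None:
        # Generation is cheap and fast; publication is deliberately slow.
        self.pending.append(text)

    def approve_and_post(self, now: float) -> Optional[str]:
        # A human reviewer calls this after inspecting the head of the queue.
        if not self.pending:
            return None
        if self.last_posted_at is not None and now - self.last_posted_at < self.min_interval:
            return None  # too soon: the throttle leaves time to catch problems
        self.last_posted_at = now
        return self.pending.popleft()

bot = ThrottledBot(min_interval_seconds=3600)
bot.generate("first post")
bot.generate("second post")
print(bot.approve_and_post(now=0.0))     # posts the first item
print(bot.approve_and_post(now=10.0))    # None: throttled
print(bot.approve_and_post(now=3600.0))  # posts the second item
```

The throttle is not a technical necessity (the bot could post far faster); it is a deliberate restraint that keeps a human meaningfully in the loop.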
Photo Credit: www.lifeofpix.com
About the Contributor, Dr. Anna Lauren Hoffmann: Dr. Hoffmann is a trans woman and academic working at the intersections of data, technology, culture, and ethics. She is a postdoctoral scholar and lecturer at the University of California, Berkeley School of Information, where she focuses on the ways in which the design and use of information technology can promote or hinder the pursuit of important human values like respect and justice. In addition to her research and writing, she teaches and develops coursework on issues of data and ethics for undergraduates in the Data Science Education Program and at the graduate level with the Datascience@Berkeley Master of Information in Data Science program.