
A Reflection On Social Responsibility and Collective Organizing in Technology

Contribution to the CSCW 2020 Workshop: Collective Organizing and Social Responsibility

Published onOct 15, 2020

Personal Statement: A Reflection On Social Responsibility and Collective Organizing in Technology

By Joy Ming

Cornell University, Ithaca, New York

[email protected]


Keywords: participatory design, design justice, research-practice divide

When I first started at Google as a software engineer on the Next Billion Users initiative, building products for “the latest generation of internet users to come online on smartphones in places like Brazil, China, India, Indonesia, and Nigeria” [3], a friend remarked, “Oh, that sounds like a lot of social responsibility.” I remember thinking this seemed obvious to me, yet it was not something we talked about at work, among coworkers or even leadership. I was working on a platform of applications built around an offline file-sharing technology (a solution in search of a problem), and the main concern was how many users we had rather than whether we were addressing a real need. This was a huge contrast for me, having spent a year doing research in rural India, where “forgetting” to consider the social impacts of technology was not really an option.

However, even in contexts where social responsibility is considered, I have seen technologies fall short of their intended goals. During my first human-computer interaction (HCI) and information and communication technologies for development (ICTD) research project, I developed an intellectually innovative paper-digital system, but the system never had an impact on the health and microfinance institutions we were working with in Ghana because it was never used [7, 18]. Similarly, as a Fulbright student research scholar during the aforementioned year in India, I found that the tablet-based application I was studying, meant to reduce maternal mortality in rural peripheral healthcare centers, could not actually do so because its implementation was hindered by the same institutional and societal forces that maintained the poor quality of care in the first place [17]. Leading research for the Kormo application at Google, I found that while we were trying to expand economic opportunity by matching job seekers to employers, our supply of predominantly gig-related jobs only heightened the instability these workers already faced under “existing modes of oppression” [14].

Angela Davis points out that while many initiatives tout “diversity” and “inclusion,” it “makes no sense to be included in an institution or society that hasn’t changed” [1]. Instead, we should think about transformation. I am interested in figuring out how we can transform the technology creation process, beyond “digital inclusion” (giving more people access to technology) or “product inclusion” (considering a more diverse set of people as potential users, like the “Next Billion Users” who were not originally thought of). I am excited by the opportunity to draw on my experiences and lessons from activist communities, like the Angela Davis quote shared earlier, the concept of the pedagogy of the oppressed [10] embodied in “people’s school,” or explorations of the wisdom of the collective through my Asian-American identity [13].

As I start the first year of my PhD, I am learning about the wealth of work being done to transform technology creation, including concepts like design justice [8], postcolonial computing [12], prefigurative design [6], and social justice-oriented HCI [9]. While some of these concepts are fairly nascent (Design Justice and Data Feminism, for instance, are books published just this year), one problem I see is that I did not have access to, or think to look for, these resources when I was outside of academia, in industry. I did try my best to share what I was learning about the social responsibility concerns of technology: co-founding Harvard College Developers for Development to encourage students to use their computer science skills for social good, hosting a 50+ person book club in the San Francisco Bay Area to teach techies about “development” [15], and, more recently, creating visual summaries of some of the concepts I was learning about, including design justice [16].

I believe my experiences in both academia and industry put me in a unique position to be a conduit between worlds. Some nascent thoughts I have include:

  1. Making best practices easier to access for people who are not necessarily steeped in academic literature, both in how the information is presented (potentially through simplified infographics or frameworks) and in the spaces where it circulates (using Medium or other tools more popular among practitioners), taking care to craft messages for this different intended audience. For example, I was excited about the Design Justice book and shared it with a friend, who found it really interesting but dense and difficult to get through more than 4-5 pages at a time.

  2. Developing stronger research-practice partnerships, where practice includes corporations as well as nonprofits and other community organizations, in order to share these accessible narratives and other best practices. One example is a cross-discipline working group that consults on and co-creates practices; the product inclusion working groups I was part of at Google would have greatly benefited from people well-versed in the latest research, findings, and practices.

  3. Updating computer science curricula to position future technology creators as people who can consider the context of their creations and the impact those creations have on people, as two recent initiatives at my alma mater, Harvard, have broached (Embedded EthiCS [11] and the public interest tech center [5]), alongside recent calls to action for antiracism in curricula such as Black in Computing [2] and the Critical Race and Digital Studies Syllabus [4].

I am excited to learn more about what others at this workshop are thinking about, and what experiences they have had, so that we can come up with ways to share this knowledge more broadly across research and practice.


  1. [n.d.]. Angela Y. Davis & Ibram X. Kendi | City Arts & Lectures.

  2. [n.d.]. An Open Letter & Call to Action to the Computing Community from Black computer scientists and our allies.

  3. 2018. The next billion users are the future of the internet.

  4. 2019. Critical Race & Digital Studies Syllabus.

  5. 2019. The intersection of tech and public interest.

  6. Mariam Asad. 2019. Prefigurative Design as a Method for Research Justice. Proceedings of the ACM on Human-Computer Interaction 3, CSCW (Nov. 2019), 200:1–200:18.

  7. Jay Chen, Azza Abouzied, David Hutchful, Joy Ming, and Ishita Ghosh. 2016. printr: Exploring the Potential of Paper-based Tools in Low-resource Settings. In Proceedings of the Eighth International Conference on Information and Communication Technologies and Development (ICTD ’16). Association for Computing Machinery, New York, NY, USA, 1–11.

  8. Sasha Costanza-Chock. 2020. Design Justice: Community-Led Practices to Build the Worlds We Need. The MIT Press, Cambridge, MA.

  9. Lynn Dombrowski, Ellie Harmon, and Sarah Fox. 2016. Social Justice-Oriented Interaction Design: Outlining Key Design Strategies and Commitments. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems (DIS ’16). Association for Computing Machinery, New York, NY, USA, 656–671.

  10. Paulo Freire and Donaldo Macedo. 2000. Pedagogy of the Oppressed (30th anniversary ed.). Continuum, New York.

  11. Barbara J. Grosz, David Gray Grant, Kate Vredenburgh, Jeff Behrends, Lily Hu, Alison Simmons, and Jim Waldo. 2019. Embedded EthiCS: integrating ethics across CS education. Commun. ACM 62, 8 (July 2019), 54–61.

  12. Lilly Irani, Janet Vertesi, Paul Dourish, Kavita Philip, and Rebecca E. Grinter. 2010. Postcolonial computing: a lens on design and development. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’10). Association for Computing Machinery, New York, NY, USA, 1311–1320.

  13. Gish Jen. 2017. The Girl at the Baggage Claim: Explaining the East-West Culture Gap (first ed.). Knopf, New York.

  14. Neha Kumar, Nassim Jafarinaimi, and Mehrab Bin Morshed. 2018. Uber in Bangladesh: The Tangled Web of Mobility and Justice. Proceedings of the ACM on Human-Computer Interaction 2, CSCW (Nov. 2018), 98:1–98:21.

  15. Joy M. 2020. Themes in technology and social impact.

  16. Joy M. 2020. Visual summary of “Design Justice” by Sasha Costanza-Chock.

  17. Joy Ming. 2016. Beyond the Partograph: Lessons from the Field on Designing Information Communication Technologies for Childbirth in Remote Healthcare Centers in India. In Proceedings of the Eighth International Conference on Information and Communication Technologies and Development (ICTD ’16). Association for Computing Machinery, New York, NY, USA, 1–4.

  18. Joy Ming, Ishita Ghosh, Jay Chen, and Azza Abouzied. 2014. Printing Paper Technology for Development. In Proceedings of the Fifth ACM Symposium on Computing for Development (ACM DEV-5 ’14). Association for Computing Machinery, New York, NY, USA, 117–118.
