By Erhardt Graeff
Olin College of Engineering
I advise two programs at Olin College of Engineering that invite undergraduate students to conduct community-engaged design work. In the fall of 2019, project teams in both of those programs decided not to design systems requested by their outside collaborators based on ethical concerns about the harm they might cause. This paper briefly describes how those decisions came to be, the need to educate for and celebrate design refusal, and how this exemplifies the need to develop the next generation of designers and technologists to be citizen professionals.
In the course Affordable Design and Entrepreneurship, offered jointly by Olin College and Babson College, I lead a project track where students support organizations working on decarceration and alternative justice through technology and data. One of our first potential collaborations was with a prosecutor’s office. They wanted to design a new case management system that instantiated their goals of refusing to prosecute low-level offenses and pursuing alternatives to prison time in sentence recommendations. Their ideal system would track performance based on the outcomes of the accused in terms of social well-being and community safety metrics rather than conviction and imprisonment rates.
This new case management system could have a dramatically positive impact. However, my students chose not to design it. The team’s values evolved over the course of their conversations with the prosecutor’s office. First, the students realized that the project’s initial design collaborators would not be their actual stakeholders—individuals at risk of incarceration—undermining a key tenet of the course’s approach to stakeholder-driven design. Second, they realized they wanted their goals and values to embody a community-centered, long-term vision for society like prison abolitionism.
Their research indicated the requested system would need to connect personally identifiable data from community organizations to individual criminal cases, likely expanding the reach of the carceral state and putting community members at greater risk of surveillance and criminalization at the whim of different prosecutors. Instead, the team redoubled their efforts to connect with stakeholders in the community and find alternative approaches that better fit their newfound values, so as not to simply optimize for large-scale impact.
Olin College offers an extracurricular, student-led Public Interest Technology Clinic (PInT) that I helped create and currently advise. The cornerstone of PInT is its clinical work with community partners addressing public problems. One clinic team partnered with an organization that aims to prevent human trafficking. The organization wanted to automate the collection of all data (including photos) about women being offered to johns on certain websites.
The students were motivated by the belief that they could create significant positive change through this project: it was well-scoped, within their technical abilities, and seemed likely to help many people. However, investigating the requested system’s technical requirements quickly led the students to ask each other and their partner difficult questions about data handling and privacy. The questions became increasingly ethical and moral as the students learned more about human trafficking. They realized that other stakeholders, such as voluntary sex workers, would be at risk, and that their partner lacked survivor representation and was organized around serving law enforcement rather than victims. The team decided to adopt principles for their work requiring that data be delivered only to local law enforcement agencies that had expertise handling human trafficking survivors, maintained a policy of not arresting victims, and worked with survivor advocacy groups to help victims afterward.
The students chose not to design the requested web scraper because they could not be assured how cases would be handled start to finish. They were uncomfortable possibly putting victims and other stakeholders in greater danger. They explained their decision at length to their partner and produced a presentation to the Olin community that celebrated the end of their project as a success.
In computing, even at its intersections with the field of participatory design, I find the emphasis is on creating optimal designs. Inappropriate or unaffordable designs might be blamed on poor design approaches or choices. Bad outcomes might be blamed on environmental factors or poor execution. A partnership can fall through, or an unfavorable cost-benefit analysis can sink a project before it launches. But I find few stories about making the affirmative decision to refuse to design something.1
Recent activism among designers and technologists, under banners like “#TechWontBuildIt,” protests their employers’ contracts with the U.S. Department of Defense or Immigration and Customs Enforcement. This is an important conversation that contributes to how we might think about design and responsibility. But I also think it is somewhat different. At the heart of these actions is a question like, “What rights do private sector tech workers have to a voice in the contracts their bosses sign them up to work on?”
Instead, the normative question “What technology work should be permitted?” and the practical question “How do we as designers (and as a society) make that determination?” remain under-explored. These are “public” questions that demand democratic deliberation. I created open-ended educational opportunities for my students to work through real community-engaged projects, and those opportunities accidentally produced valuable experiences of consciously refusing to design things. I’m proud of these outcomes, but I can’t easily recreate the circumstances for other students. I need more examples. I need a movement.
Can we imagine a citizen professionalism movement in engineering? We might call members of the movement “citizen engineers,” indicating the entwinement of professional practice with citizenship’s roles and responsibilities. Dzur calls this “democratic professionalism,” whereby practitioners “do professionalism democratically,” discarding hierarchies of expertise and the depoliticization of work (known to be a problematic pillar of engineering culture). Citizen professionals engage in “public work,” committing to models of co-creation, joint problem solving, and shared ownership. The emerging “design justice” movement exemplifies designers making such commitments. “Public interest technology,” the nascent field advocated for by the Ford Foundation, New America, and others, also aspires toward this.
Achieving engineering and design as public work requires understanding that the responsibilities of citizenship are central to your work and that the problems you work on are always “public” problems. A responsibility to the public enlarges individual design tasks to consider the broader set of contexts and ethics involved in computing and engineering. This requires that we break out of the “box” that artificially scopes a design project and acknowledge its social, political, economic, and other dimensions. A citizen professionalism movement would have engineers constantly collaborating with their fellow citizens, investigating whether systems are likely to surveil, criminalize, or exacerbate inequality, and refusing to design systems that are not in the public interest.
Encouraging my students to be citizen engineers and asking all designers and technologists to be citizen professionals builds on the tradition of societies like Computer Professionals for Social Responsibility and puts justice and democratic practice at the heart of professional work. I believe the values that stem from acknowledging one’s public purpose can be the basis for cultivating normative vision and the practical tools for participatory design (and design refusal). I am committing to develop this as a pillar of my pedagogy and design practice. I ask for your help as a fellow citizen.