By Zainab Agha, Karla Badillo-Urquiola, Neeraj Chatlani, Ashwaq Alsoubai and Pamela Wisniewski
University of Central Florida, Orlando, FL
There is still much to be done to protect all teens from online risks and increase their opportunities to engage safely with others online, but it is important not to forget the needs of particularly vulnerable and “at-risk” youth when we embark on such endeavors. We discuss some of the challenges related to ensuring inclusivity and equity when conducting adolescent online safety research and designing solutions that meet the needs of teens, especially those who benefit most from technology access but cannot rely on parents to keep them safe online. Our goal in attending this workshop is to address the challenges of conducting socially responsible adolescent online safety research and of ensuring inclusivity, as well as agency for teens, in our participatory design practices.
Adolescent Online Safety; Equity; Inclusion; Social Justice
We tend to associate “equity” and “inclusivity” in research with populations that have physical or mental disabilities, such as people with visual impairments or those who are Autistic. In these cases, technology has been called the “great equalizer” because it can remove barriers and provide augmented capabilities to those who do not have them. However, with this framing, we may overlook populations that are marginalized due to other social and cultural factors. For instance, in adolescent online safety research, youth who identify as LGBT, belong to ethnic and racial minorities, lack parental involvement, or are in the foster care system are the most “at-risk” of being harmed online. At-risk youth have unique sets of characteristics and challenges that create substantial nuance and require a deeper understanding of users beyond what we may typically attribute to the broader population of teens. In our future work, we are committed to addressing the digital inequalities that affect the lives and online safety of youth by ensuring that marginalized teen populations have a voice in the design of technology built for them. Thus, attending the Collective Organizing and Social Responsibility Workshop will provide us with an invaluable opportunity to discuss the challenges of supporting youth in participatory design, so that they are active and respected agents in designing equitable solutions for online safety.
Walker et al. created heuristic guidelines for conducting research with at-risk populations and highlighted the importance of weighing the costs and benefits of research with marginalized groups. A critical step of their heuristic framework is giving back to the community by reporting findings and outcomes to broader audiences, especially those who participated in the research. As adolescent online safety researchers, it is our social responsibility to engage in such “scholar activism” and present our findings to the teen community in a way that makes teens aware of the meaningful impact their contributions have on other youth and their families. For instance, researchers could conduct online safety workshops for teen participants, as well as their families and communities, providing practical ways to empower teens to take control of their own online safety. In disseminating findings, researchers can create user-friendly resources for participants’ benefit. Additionally, adolescent online safety researchers are encouraged to involve participants directly through participatory action research, which benefits the communities we work with.
Therefore, we aim to engage with and uplift the voices of adolescents in online safety research, and find ways for them to be active agents in the participatory design process. Our research team is working on several participatory design studies with teens to (a) establish best practices for partnering with teens as primary stakeholders in the design of equitable interventions for adolescent online safety, (b) develop teen-centric and viable solutions that address a broad range of online risk scenarios relevant to youth (especially those at higher risk), and (c) work with teens to understand how we can leverage their existing social ecologies of support (e.g., family, peers, school, and community) as part of these online safety interventions [1,4,19,20].
We outline three of the main concerns that have arisen out of our research that we would like to discuss with the other workshop participants.
How do we engage those who are inherently disengaged? In conducting family-based research, self-selection processes around recruitment create an implicit bias toward parents (and, more specifically, mothers) who are more actively engaged in their children’s lives than those who are not. Otherwise, they would not opt to participate in the research, nor consent for their teens to do so. Yet, this sub-population of families with highly involved and, likely, well-educated parents is often not the population that would benefit most from research or evidence-based intervention programs developed for youth. Therefore, how do we incentivize participation by those who would benefit most from being a part of our research? In other words, how do we ensure inclusivity and equity in our recruiting practices, in addition to gaining access to the vulnerable populations that could most benefit from our work?
How can we include diverse perspectives and engage all stakeholders when we design interactive systems? Participatory design encourages the researcher to place value on stakeholders’ knowledge by allowing them to share their perspectives. Researchers have proposed using participatory design techniques to involve users in the research and design process, and this approach has been successful when working with teen populations. Three principles known to facilitate the successful participation of teens are: 1) transparency – clearly expressing the objectives to participants, 2) autonomy – participants control what information is shared and how, and 3) literacy – participants have both technical and social knowledge of the problem. We are interested in understanding how we may apply this technique and are open to suggestions for other approaches that might be appropriate for engaging in user-centric design processes with “at-risk” (particularly foster) youth.
How do we ensure that we are creating solutions that benefit all users fairly? When examining current approaches for protecting youth from online risks, especially those who are at highest risk, we found that many solutions rely heavily on risk prevention through restricting access to technology. Yet, research shows that access to technology can be beneficial by reducing inequalities and providing valuable resources to youth; for example, it can facilitate social support and provide access to critical information related to medical health, school work, and future employment. Thus, while such restrictive solutions may protect some youth, they may also create inequities for others, as research has shown that socio-economic status plays a large role in family values and practices around technology. Therefore, when designing and evaluating real-world solutions, how can we ensure that the benefits they provide (or harms they cause) are equitable across all users?
We identified the main challenges of inclusivity in recruitment and design when conducting socially responsible adolescent online safety research. Our goal in attending this workshop is to engage in constructive discussion with academics, designers, and practitioners to find actionable ways of creating inclusive designs and conducting socially responsible adolescent online safety research. By attending this workshop, we hope to create a network of researchers with whom we can share experiences and collectively move towards activism and advocacy in HCI research.
This research was supported by the William T. Grant Foundation (#187941, #190017) and the National Science Foundation under grants CHS-1844881, IIP-1827700, and CNS-1814439. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of our sponsors.
Zainab Agha is a second year Ph.D. student and ORC Doctoral Fellow in the Department of Computer Science at the University of Central Florida. Her research takes a teen-centric approach to online safety, focusing on parent-teen collaboration in co-designing online safety interventions.
Karla Badillo-Urquiola is a Ph.D. candidate and McKnight Doctoral Fellow in Modeling and Simulation at the University of Central Florida. She leverages her interdisciplinary background to investigate online safety and privacy for teens in the foster care system.
Neeraj Chatlani is a Ph.D. student in the Department of Modeling and Simulation at the University of Central Florida. His current work involves the creation of risk models of teen online interactions, and the use of participatory design to engage teens in the co-development of online safety strategies.
Pamela J. Wisniewski is the Director of the STIR Lab and an Associate Professor in the Department of Computer Science at the University of Central Florida. Her work lies at the intersection of Social Computing and Privacy and she is an expert in the interplay between social media, privacy, and online safety for adolescents. https://stirlab.org/
Zainab Agha, Neeraj Chatlani, Afsaneh Razi, and Pamela Wisniewski. 2020. Towards Conducting Responsible Research with Teens and Parents regarding Online Risks. Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Association for Computing Machinery, 1–8.
Morgan G. Ames, Janet Go, Joseph “Jofish” Kaye, and Mirjana Spasojevic. 2011. Understanding technology choices and values through social class. Proceedings of the ACM 2011 conference on Computer supported cooperative work, Association for Computing Machinery, 55–64.
Karla Badillo-Urquiola, Scott Harpin, and Pamela Wisniewski. 2017. Abandoned but Not Forgotten: Providing Access While Protecting Foster Youth from Online Risks. Proceedings of the 2017 Conference on Interaction Design and Children, Association for Computing Machinery, 17–26.
Karla Badillo-Urquiola, Diva Smriti, Brenna McNally, Evan Golub, Elizabeth Bonsignore, and Pamela J. Wisniewski. 2019. Stranger Danger!: Social Media App Features Co-designed with Children to Keep Them Safe Online. Proceedings of the 18th ACM International Conference on Interaction Design and Children (IDC ’19), ACM Press, 394–406.
Adam Bell and Katie Davis. 2016. Learning through Participatory Design: Designing Digital Badges for and with Teens. Proceedings of the 15th International Conference on Interaction Design and Children, Association for Computing Machinery, 218–229.
Dale Fitch. 2012. Youth in Foster Care and Social Media: A Framework for Developing Privacy Guidelines. Journal of Technology in Human Services 30, 2: 94–108.
Nora Gustavsson and Ann MacEachron. 2015. Positive Youth Development and Foster Care Youth: A Digital Perspective. Journal of Human Behavior in the Social Environment 25, 5: 407–415.
Charles R. Hale, ed. 2008. Engaging Contradictions: Theory, Politics, and Methods of Activist Scholarship. University of California Press, Berkeley, CA.
Gillian R. Hayes. 2014. Knowing by Doing: Action Research as an Approach to HCI. In J.S. Olson and W.A. Kellogg, eds., Ways of Knowing in HCI. Springer, New York, NY, 49–68.
Simon Huang, Lynsey J. Martin, Calvin H. Yeh, et al. 2018. The effect of an infographic promotion on research dissemination and readership: A randomized controlled trial. Canadian Journal of Emergency Medicine 20, 6: 826–833.
Anthony T. Pinter, Pamela J. Wisniewski, Heng Xu, Mary Beth Rosson, and Jack M. Caroll. 2017. Adolescent Online Safety: Moving Beyond Formative Evaluations to Designing Solutions for the Future. Proceedings of the 2017 Conference on Interaction Design and Children, Association for Computing Machinery, 352–357.
Erika S. Poole and Tamara Peyton. 2013. Interaction design research with adolescents: methodological challenges and best practices. Proceedings of the 12th International Conference on Interaction Design and Children, Association for Computing Machinery, 211–217.
Kyle Rector, Roger Vilardaga, Leo Lansky, et al. 2017. Design and Real-World Evaluation of Eyes-Free Yoga: An Exergame for Blind and Low-Vision Exercise. ACM transactions on accessible computing 9, 4.
Kiley Sobel, Katie O’Leary, and Julie A. Kientz. 2015. Maximizing children’s opportunities with inclusive play: considerations for interactive technology design. Proceedings of the 14th International Conference on Interaction Design and Children, Association for Computing Machinery, 39–48.
Pamela Wisniewski, Arup Kumar Ghosh, Heng Xu, Mary Beth Rosson, and John M. Carroll. 2017. Parental Control vs. Teen Self-Regulation: Is there a middle ground for mobile online safety? Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing, Association for Computing Machinery, 51–69.
Janis Wolak, David Finkelhor, Kimberly J. Mitchell, and Michele L. Ybarra. 2010. Online “predators” and their victims: Myths, realities, and implications for prevention and treatment. Psychology of Violence 1, S: 13–35.
2012. Technology Change as the Great Equalizer. whitehouse.gov. Retrieved October 7, 2020 from https://obamawhitehouse.archives.gov/blog/2012/05/07/technology-change-great-equalizer.
Robin Stevens, Stacia Gilliard-Matthews, Jamie Dunaev, Marcus K. Woods, and Bridgette M. Brawner. 2017. The digital hood: Social media use among youth in disadvantaged neighborhoods. New Media & Society. Retrieved October 7, 2020 from https://journals.sagepub.com/doi/10.1177/1461444815625941.
NSF Award Search: Award#1844881 - CAREER: Safety by Design: Protecting Adolescents from Online Risks. Retrieved September 30, 2020 from https://www.nsf.gov/awardsearch/showAward?AWD_ID=1844881.
NSF Award Search: Award#1827700 - PFI-RP: A Multi-Disciplinary Approach to Detecting Adolescent Online Risks. Retrieved January 5, 2020 from https://www.nsf.gov/awardsearch/showAward?AWD_ID=1827700.