As she was starting her computer science PhD at the University of Maryland, Elissa Redmiles noticed an important gap in the research that shaped digital security: few scholars were considering how a person's identity or life experience might influence their perspective on staying safe. A number of studies looked into how human abilities or behaviors influence technical design, but many of them, she said, shared a paternalistic assumption that people simply didn't understand security risks—so experts would need to educate them or make decisions for them.
Once Redmiles, 30, started investigating, she repeatedly heard otherwise: that people, including some living in precarious conditions, did understand—but sometimes had to balance digital safety against what they considered bigger threats. Sex workers in Europe, for example, told her they couldn’t move to encrypted apps because clients didn’t want to.
Today, as an assistant professor of computer science at Georgetown University, Redmiles is building on this research and pioneering new ways of incorporating user participation into computer science and security. She combines social science, economics, and computational methods so that developers can weigh the different factors affecting security decisions according to the use case.
During the pandemic, she conducted research to better understand how people felt about covid-19 contact tracing applications, and based on the results she offered empirically backed guidance to seven US states and several foreign countries as they adopted such apps.
More recently, Redmiles has been combating image-based sexual abuse. She is a founder of SafeDigitalIntimacy.org, a research collective that provides data and tools to help policymakers in government and at tech companies slow the nonconsensual sharing of intimate images. She also co-authored the “IBSA Principles,” which offer guidelines to help governments and platforms combat image-based sexual abuse from the start. The principles were announced last year by then-President Joe Biden and have been co-signed by 10 major tech platforms, including Microsoft and Facebook.
Redmiles hopes that all of this work will ultimately change how technology is developed: that technical expertise will be balanced with individuals’ lived experiences to better detect unsafe outputs from generative AI, determine privacy protections in apps, or decide which tools to deploy to protect intimate digital interactions.