2023 NSF Fellowship Recipients

Congratulations to two Security Lab PhD students for being selected as National Science Foundation Graduate Research Fellowship Program (GRFP) recipients!! Cross-posting the details from Allen School News, written by Jennifer Webster:


Rachel Hong

Fellowship recipient Rachel Hong is a first-year Ph.D. student. She works with Allen School professors Jamie Morgenstern, who focuses on the social impacts of ML, and Tadayoshi (Yoshi) Kohno, co-director of the Security and Privacy Research Lab.

Combining ML, security and technology policy, Hong explores the behavior of existing ML algorithms in relation to privacy and fairness, as well as how to prevent those algorithms from being misapplied in society. As an undergraduate student, Hong was introduced to the field of algorithmic fairness through building a novel representation learning algorithm on biomedical data to help patients receiving care at a variety of hospitals in both rural and urban settings. Hong seeks to build on that foundation to improve algorithmic fairness through examining demographic biases in facial recognition technology to better understand how various modifications of training data can mitigate disparate outcomes.


Alexandra Michael

First-year Ph.D. student Alexandra Michael received a fellowship for her research. She is co-advised by Allen School professors David Kohlbrenner in the Security and Privacy Research Lab and Dan Grossman in the Programming Languages and Software Engineering (PLSE) group.

Michael’s research combines her interests in security, programming languages and compilers. Prior to graduate school, Michael was fascinated by how computers could connect people yet put them at risk. Her work focuses on mitigating those risks by leveraging programming languages and security tools to improve the security and privacy of systems and the people who use them. She proposes to build a highly performant, secure and portable low-level language that will act as a target for programs written in unsafe languages.

Tina Yeung at the WebConf 2023

Tina Yeung presenting “Online Advertising in Ukraine and Russia During the 2022 Russian Invasion” at the ACM Web Conference 2023

Congratulations to Tina Yeung for a great talk at the ACM Web Conference 2023 in Austin, Texas! Tina presented the paper “Online Advertising in Ukraine and Russia During the 2022 Russian Invasion”, which you can read more about here. And even more congratulations to Tina and her co-authors for having the paper selected as a “Spotlight Paper” for the conference, which means it was nominated for the Best Paper Award and received a longer (20-minute) presentation slot.

Test of Time Award for NSDI 2012 work on web tracking

(Cross-posted from Allen School News, by Kristin Osborne)

Tadayoshi Kohno (left) and Franziska Roesner at NSDI 2023. Photo by Liz Markel, courtesy of USENIX

There was a time when cookies were considered something to be savored — back when chips referred to chocolate rather than silicon. Once “cookies” became synonymous with online tracking, privacy researchers weren’t so sweet on the concept. 

That includes Allen School professors Franziska Roesner and Tadayoshi Kohno, who investigated the online tracking ecosystem for their 2012 paper “Detecting and Defending Against Third-Party Tracking on the Web.” Last month, Roesner, Kohno and co-author David Wetherall, a former Allen School professor who is now a Distinguished Engineer at Google, received the Test of Time Award at the 20th USENIX Symposium on Networked Systems Design and Implementation (NSDI 2023) for their influential work, which offered the first comprehensive evaluation of third-party trackers and their intrusion into people’s activities online. 

The team’s findings informed the nascent policy debate around web privacy that has become all the more relevant with the proliferation of social media and reliance on targeted advertising as a revenue model. They also led to the creation of new tools like Privacy Badger, a browser extension used by millions of people to protect themselves and their browsing histories online by learning about and automatically blocking hidden third-party trackers. The work also inspired a significant body of follow-on research, including the team members’ subsequent paper at NSDI 2016 chronicling the increase in both the prevalence of online tracking and the complexity of tracker behavior over time.

“Considering how much time we spend online and the variety of activities we engage in, this type of tracking can yield a lot of information about a person,” said Roesner, a co-director of the Security and Privacy Research Lab at the University of Washington along with Kohno. “That’s even truer today than it was a decade ago, and I’m gratified that our work helped initiate such an important conversation and informed efforts to educate and empower users.”

At the time of the original paper’s release, third-party tracking had started to gain attention in security and privacy circles. But researchers were just nibbling around the edges, for the most part; they had a fragmented understanding of how such trackers worked and their impact on people’s online experience. Roesner — an Allen School Ph.D. student at the time — worked with Kohno and Wetherall to develop a client-side method for detecting and classifying trackers according to how they interact with the browser. They analyzed tracker prevalence and behavior on the top 500 website domains, as identified by the now-defunct web traffic analysis firm Alexa Internet, examining more than 2,000 unique pages.

“We identified 524 unique trackers, some of which had sufficient penetration across popular websites to enable them to capture a significant fraction of a user’s browsing activity — typically around 20%, and in one case, as much as 66%,” Roesner recalled.

Roesner and her colleagues cataloged five types of tracker behavior, ranging from the relatively benign, to the opportunistic, to the infuriating: within-site analytics trackers, which are generally confined to a specific site (Google Analytics being an example); “vanilla” trackers, which rely on third-party storage to track users across sites for the purposes of additional analytics or targeted advertising, such as Doubleclick; forced trackers, which include the dreaded popup or redirect that compels the user to visit the tracker’s domain; referred trackers, which rely on unique identifiers leaked by other trackers; and personal trackers, which engage in cross-site tracking based on a user’s voluntary visits to their domain in other contexts. Some trackers exhibit a combination of these behaviors.
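The taxonomy above can be sketched as a simple classification rule. The sketch below is a hypothetical simplification for illustration only: the field names and decision order are assumptions, not the paper's actual browser-instrumentation mechanism.

```python
# Illustrative sketch of the five-category tracker taxonomy.
# All names and the decision logic are simplified assumptions.
from dataclasses import dataclass


@dataclass
class TrackerObservation:
    uses_third_party_storage: bool    # sets/reads state from a third-party position
    tracks_cross_site: bool           # same identifier observed across sites
    forces_first_party_visit: bool    # popup/redirect compels a visit to its domain
    id_leaked_by_other_tracker: bool  # relies on an identifier leaked by another tracker
    visited_directly_by_user: bool    # user voluntarily visits its domain in other contexts


def classify(t: TrackerObservation) -> str:
    """Map observed behavior to one of the five categories in the paper."""
    if t.forces_first_party_visit:
        return "forced"
    if t.id_leaked_by_other_tracker:
        return "referred"
    if t.visited_directly_by_user and t.tracks_cross_site:
        return "personal"
    if t.uses_third_party_storage and t.tracks_cross_site:
        return "vanilla"
    return "analytics"  # within-site analytics only
```

A tracker observed using third-party cookies across many sites, never visited directly, would fall into the “vanilla” bucket under this sketch.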

Despite the existence of multiple tools intended to give users more control, from third-party cookie blockers to “private” browsing mode, the team found those options insufficient for preventing certain trackers from following people across the web while maintaining any semblance of functionality. This was particularly true for popular social widgets by the likes of Facebook, Twitter, LinkedIn, Digg, and others that were embedded on a growing number of sites ranging from news outlets to online storefronts.

David Wetherall

“While users could prevent some tracking, that was not the case for social widgets,” noted Roesner. “If a user was logged into a social media site like Facebook, for instance, their activity elsewhere on the web would be tracked — non-anonymously, I would add — even if they didn’t interact with the ‘like’ button embedded on those sites.”

For those who would prefer to cover their tracks while continuing to enjoy the convenience of interacting with social widgets on their terms, Roesner and her collaborators developed ShareMeNot. The browser extension took a bite out of social widgets’ ability to construct browsing profiles of users by only allowing activation of third-party tracking cookies when a user explicitly interacted with the “like,” “share,” or other relevant buttons; if a user visited a site but did not click on the social widgets, ShareMeNot stripped the cookies from any third-party requests to those trackers.
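ShareMeNot's core policy can be sketched in a few lines. This is a minimal illustration of the idea described above, not the extension's actual code: the domain list, function names, and request representation are all assumptions.

```python
# Illustrative sketch of ShareMeNot's cookie-stripping policy:
# remove cookies from third-party requests to known social-widget
# trackers unless the user explicitly clicked that widget.
# Domain list and function signature are hypothetical.

SOCIAL_WIDGET_TRACKERS = {"facebook.com", "twitter.com", "linkedin.com", "digg.com"}


def filter_request(headers: dict, request_domain: str,
                   page_domain: str, user_clicked_widget: bool) -> dict:
    """Return request headers, with cookies stripped when policy demands it."""
    is_third_party = request_domain != page_domain
    if (is_third_party
            and request_domain in SOCIAL_WIDGET_TRACKERS
            and not user_clicked_widget):
        # Drop the Cookie header so the tracker cannot link this visit
        # to the user's logged-in identity.
        return {k: v for k, v in headers.items() if k.lower() != "cookie"}
    return headers
```

Under this policy, merely loading a page with an embedded “like” button sends no identifying cookie; clicking the button restores normal behavior for that request.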

The team worked with an undergraduate research assistant in the lab, Chris Rovillos (B.S., ‘14), to refine ShareMeNot following the paper’s initial publication and address instances of trackers attempting to circumvent the restrictions on cookies via other means. Instead of just blocking cookies, the new and improved version of the tool blocked tracker buttons altogether. In their place, ShareMeNot inserted local, stand-in versions of the buttons that users could click to either “like” a page directly or load the real button — putting users, not the trackers, in control. Roesner partnered with the nonprofit Electronic Frontier Foundation to incorporate ShareMeNot into the previously mentioned Privacy Badger, which remains an important tool for protecting users from intrusion by third-party trackers to this day.

The team’s work is notable for inspiring not only new technologies but also a new wave of researchers to focus on web tracking. One of those researchers, Umar Iqbal, followed that inspiration all the way to the Allen School.

“This is one of the seminal works in the space of web privacy and security. It had an immense influence on the community, including my own research,” observed Iqbal, a postdoc in the Security and Privacy Research Lab. “I extended several of the techniques proposed in the paper as part of my own doctoral thesis, from the measurement of online trackers, to their characterization, to building defenses. It was, in fact, one of the reasons I decided to pursue a postdoc with Franzi at UW!”

Roesner, Kohno and Wetherall were formally recognized at NSDI 2023 last month in Boston, Massachusetts. Read the research paper here. Read USENIX’s story here.

Honorable Mention for NSA Best Scientific Cybersecurity Paper

Congratulations to Alaa Daffalla (Cornell), Lucy Simko (UW Security Lab alumna), Tadayoshi Kohno (UW Security Lab faculty member), and Alexandru Bardas (Kansas) for being recognized with the Honorable Mention in the NSA’s 10th Annual Best Scientific Cybersecurity Paper Competition!! This is a huge honor and important work. You can read the paper, which was published at the IEEE Symposium on Security & Privacy in 2021, here: “Defensive Technology Use by Political Activists During the Sudanese Revolution”.

Tadayoshi Kohno named IEEE Fellow

(Cross-posted from Allen School News, by Kristin Osborne)

IEEE Fellow Tadayoshi Kohno’s contributions to security and privacy span multiple domains, from the automobile and medical device industries, to electronic voting and mixed reality.

IEEE honored Kohno for “contributions to cybersecurity” — an apt reference to Kohno’s broad influence across a variety of domains. As co-director of the Allen School’s Security and Privacy Research Lab and the UW Tech Policy Lab, Kohno has explored the technical vulnerabilities and societal implications of technologies ranging from do-it-yourself genealogy research, to online advertising, to mixed reality. 

His first foray into high-profile security research, as a Ph.D. student at the University of California San Diego, struck at the heart of democracy: security and privacy flaws in the software that powered electronic voting machines. What Kohno and his colleagues discovered shocked vendors, elections officials, and other cybersecurity experts.

“Not only could votes and voters’ privacy be compromised by insiders with direct access to the machines, but such systems were also vulnerable to exploitation by outside attackers as well,” said Kohno. “For instance, we demonstrated that a voter could cast unlimited votes undetected, and they wouldn’t require privileged access to do it.”

After he joined the University of Washington faculty, Kohno turned his attention from safeguarding the heart of democracy to an actual heart when he teamed up with other security researchers and physicians to study the security and privacy weaknesses of implantable medical devices. They found that devices such as pacemakers and cardiac defibrillators that rely on embedded computers and wireless technology to enable physicians to non-invasively monitor a patient’s condition were vulnerable to unauthorized remote interactions that could reveal sensitive health information — or even reprogram the device itself. This groundbreaking work earned Kohno and his colleagues a Test of Time Award from the IEEE Computer Society Technical Committee on Security and Privacy in 2019.

“To my knowledge, it was the first work to experimentally analyze the computer security properties of a real wireless implantable medical device,” Kohno recalled at the time, “and it served as a foundation for the entire medical device security field.”

Kohno and a group of his students subsequently embarked on a project with researchers at his alma mater that revealed the security and privacy risks of increasingly computer-dependent automobiles, and in dramatic fashion: by hacking into a car’s systems and demonstrating how it was possible to take control of its various functions.

“It took the industry by complete surprise,” Kohno said in an article published in 2020. “It was clear to us that these vulnerabilities stemmed primarily from the architecture of the modern automobile, not from design decisions made by any single manufacturer … Like so much that we encounter in the security field, this was an industry-wide issue that would require industry-wide solutions.” 

Those industry-wide solutions included manufacturers dedicating new staff and resources to the cybersecurity of their vehicles, the development of new national automotive cybersecurity standards, and the creation of a new cybersecurity testing laboratory at the National Highway Traffic Safety Administration. Kohno and his colleagues have been recognized multiple times and in multiple venues for their role in these developments, including a Test of Time Award in 2020 from the IEEE Computer Society Technical Committee on Security and Privacy and a Golden Goose Award in 2021 from the American Association for the Advancement of Science. And most importantly, millions of cars — and their occupants — are safer as a result.

Kohno has journeyed into other uncharted territory by exploring how to mitigate privacy and security concerns associated with nascent technologies, from mixed reality to genetic genealogy services. For example, he and Security and Privacy Research Lab co-director Franziska Roesner have collaborated on an extensive line of research focused on safeguarding users’ security and privacy in augmented-reality environments. The results include ShareAR, a suite of developer tools for safeguarding users’ privacy while enabling interactive features in augmented-reality environments. They also worked with partners in the UW Reality Lab to organize a summit for members of academia and industry and issue a report exploring design and regulatory considerations for ensuring the security, privacy and safety of mixed reality technologies. Separately, Kohno teamed up with colleagues in the Molecular Information Systems Lab to uncover how vulnerabilities in popular third-party genetic genealogy websites put users’ sensitive personal genetic information at risk. Members of the same team also demonstrated that DNA sequencing software could be vulnerable to malware encoded into strands of synthetic DNA, an example of the burgeoning field of cyber-biosecurity. 

Kohno’s contributions to understanding and mitigating emerging cybersecurity threats extend to autonomous vehicle algorithms, mobile devices, and the Internet of Things. Although projects exposing the hackability of cars and voting machines may capture headlines, Kohno himself is most captivated by the human element of security and privacy research — particularly as it relates to vulnerable populations. For example, he and his labmates recently analyzed the impact of electronic monitoring apps on people subjected to community supervision, also known as “e-carceration.” Their analysis focused on not only the technical concerns but also the experiences of people compelled to use the apps, from privacy concerns to false reports and other malfunctions. Other examples include projects exploring security and privacy concerns of recently arrived refugees in the United States, with a view to understanding how language barriers and cultural differences can impede the use of security best practices and make them more vulnerable to scams, and technology security practices employed by political activists in Sudan in the face of potential government censorship, surveillance, and seizure. 

“I chose to specialize in computer security and privacy because I care about people. I wanted to safeguard people against the harms that can result when computer systems are compromised,” Kohno said. “To mitigate these harms, my research agenda spans from the technical — that is, understanding the technical possibilities of adversaries as well as advancing technical approaches to defending systems — to the human, so that we also understand people’s values and needs and how they prefer to use, or not use, computing systems.”

In addition to keeping up with technical advancements that could impact privacy and security, Kohno is also keen to push the societal implications of new technologies to the forefront. To that end, he and colleagues have investigated a range of platforms and practices — from developing design principles to safeguard vulnerable and marginalized populations, to understanding how online political advertising contributes to the spread of misinformation — with the aim of educating and supporting the researchers and developers of these technologies. He has also attempted to highlight the ethical issues surrounding new technologies through a recent foray into speculative and science fiction writing. For example, his self-published novella “Our Reality” explores how mixed reality technologies designed with a default user in mind can have real-world consequences for people’s education, employment, access to services, and even personal safety.

“It’s important as researchers and practitioners to consider the needs and concerns of people with different experiences than our own,” Kohno said. “I took up fiction writing for the joy of it, but also because I wanted to enable educators and students to explore some of the issues raised by our research in a more accessible way. Instead of identifying how a technology might have gone wrong, I want to help people focus from the start on answering the question, ‘how do we get this right?’”

SOUPS and USENIX Security 2022

Many members of the UW Security and Privacy Research Lab were thrilled last week to finally re-join our broader research community in person in Boston, at SOUPS and USENIX Security 2022. It was fantastic to see some of our alumni, talk in person with current and future collaborators, meet new members of the community, catch up with old and new friends, and more!

UW Security Lab members and alumni at USENIX Security 2022: Yoshi Kohno, Ada Lerner, Kentrell Owens, Kimberly Ruth, Earlence Fernandes, Eric Zeng, Umar Iqbal, Kaiming Cheng, Miranda Wei, and Franzi Roesner

Our members presented a great set of talks across both conferences.

Designing beyond the default: Allen School researchers receive NSF award to address privacy and security needs of marginalized and vulnerable populations

(Cross-posted from Allen School News, by Kristin Osborne)

For people around the world, technology eases the friction of everyday life: bills paid with a few clicks online, plans made and sometimes broken with the tap of a few keys, professional and social relationships initiated and sustained from anywhere at the touch of a button. But not everyone experiences technology in a positive way, because technology — including built-in safeguards for protecting privacy and security — isn’t designed with everyone in mind. In some cases, the technology community’s tendency to develop for a “default persona” can lead to harm. This is especially true for people who, whether due to age, ability, identity, socioeconomic status, power dynamics or some combination thereof, are vulnerable to exploitation and/or marginalized in society.

Researchers in the Allen School’s Security & Privacy Research Lab have partnered with colleagues at the University of Florida and Indiana University to provide a framework for moving technology design beyond the default when it comes to user security and privacy. With a $7.5 million grant from the National Science Foundation through its Secure and Trustworthy Cyberspace (SaTC) Frontiers program, the team will blend computing and the social sciences to develop a holistic and equitable approach to technology design that addresses the unique needs of users who are underserved by current security and privacy practices.

“Technology is an essential tool, sometimes even a lifeline, for individuals and communities. But too often the needs of marginalized and vulnerable people are excluded from conversations around how to design technology for safety and security,” said Allen School professor and co-principal investigator Franziska Roesner. “Our goal is to fundamentally change how our field approaches this question to center the voices of marginalized and vulnerable people, and the unique security and privacy threats that they face, and to make this the norm in future technology design.”

To this end, Roesner and her collaborators — including Allen School colleague and co-PI Tadayoshi Kohno — will develop new security and privacy design principles that focus on mitigating harm while enhancing the benefits of technology for marginalized and vulnerable populations. These populations are particularly susceptible to threats to their privacy, security and even physical safety through their use of technology: children and teenagers, LGBTQ+ people, gig and sex workers, people with sensory impairments, people who are incarcerated or under community supervision, and people with low socioeconomic status. The team will tackle the problem using a three-pronged approach, starting with an evaluation of how these users have been underserved by security and privacy solutions in the past. They will then examine how these users interact with technology, identifying both threats and benefits. Finally, the researchers will synthesize what they learned to systematize design principles that can be applied to the development of emerging technologies, such as mixed reality and smart city technologies, to ensure they meet the privacy and security needs of such users.

The researchers have no intention of imposing solutions on marginalized and vulnerable communities; a core tenet of their proposal is direct consultation and collaboration with affected people throughout the duration of the project. They will accomplish this through both quantitative and qualitative research that directly engages communities in identifying their unique challenges and needs and evaluating proposed solutions. The team will apply these insights as it explores how to leverage or even reimagine technologies to address those challenges and needs while adhering to overarching security and privacy goals around the protection of people, systems, and data.

The team’s approach is geared to ensuring that the outcomes are relevant as well as grounded in rigorous scientific theory. It’s a methodology that Roesner, Kohno, and their colleagues hope will become ingrained in the privacy and security community’s approach to new technologies — but they anticipate the impact will extend far beyond their field.

Tadayoshi Kohno (left) and Franziska Roesner. Dennis Wise

“In addition to what this will mean in terms of a more inclusive approach to designing for security and privacy, one of the aspects that I’m particularly excited about is the potential to build a community of researchers and practitioners who will ensure that the needs of marginalized and vulnerable users will be met over the long term,” said Kohno. “Our work will not only inform technology design, but also education and government policy. The impact will be felt not only in the research and development community but also society at large.”

Kohno and Roesner are joined in this work by PI Kevin Butler and co-PIs Eakta Jain and Patrick Traynor at the University of Florida, co-PIs Kurt Hugenberg and Apu Kapadia at Indiana University, and Elissa Redmiles, CEO & Principal Researcher at Human Computing Associates. The team’s proposal, “Securing the Future of Computing for Marginalized and Vulnerable Populations,” is one of three projects selected by NSF in its latest round of SaTC Frontiers awards worth a combined $24.5 million. The other projects focus on securing the open-source software supply chain and extending the “trusted execution environment” principle to secure computation in the cloud.

Read the NSF announcement here and the University of Florida announcement here.
