Introducing Prof. Iqbal

Prof. Umar Iqbal has moved on from his postdoc position in the UW Security Lab to start an assistant professor position at Washington University in St. Louis. Congratulations to both WashU and to Prof. Iqbal! We are excited to see all the great research you will do next!!

Miranda Wei presents at CHI 2023 and IEEE S&P 2023, passes General Exam

Miranda Wei presenting “Skilled or Gullible? Gender Stereotypes Related to Computer Security and Privacy” at the 2023 IEEE Symposium on Security & Privacy

Congratulations to Miranda Wei for passing her General Exam today and officially becoming a PhD “Candidate”!

Her PhD dissertation proposal builds in part on her excellent work studying gender stereotypes in computer security and privacy, which she recently presented at the IEEE Symposium on Security & Privacy in San Francisco, and on her work (a wonderful collaboration with Google) studying advice for staying safe from hate and harassment online, which she presented at the ACM Conference on Human Factors in Computing Systems (CHI) in Hamburg, Germany. Congratulations, Miranda!!

Check out Miranda’s papers here:

Introducing Prof. Saadia Gabriel

Saadia Gabriel graduating with her PhD, along with advisors Yejin Choi (left) and Franzi Roesner (right)

Congratulations to newly-minted PhD and soon-to-be Professor Saadia Gabriel! Prof. Gabriel was co-advised by Yejin Choi (UW NLP) and Franzi Roesner (UW Security Lab), and she will be joining UCLA as an assistant professor in the fall of 2024, after some time as a postdoc at MIT then a Faculty Fellow at NYU. Congratulations, Prof. Gabriel!! MIT, NYU, and UCLA are all lucky to get you!

Showcasing Undergraduate and BS/MS Researchers

The UW Security Lab is lucky to work with a number of impressive undergraduate and 5th year Masters students among our researchers. We’re excited to share some of their work that was showcased recently.

Theo Gregersen presenting at the UW Undergraduate Research Symposium
Chongjiu Gao and Sergio Medina presenting at the UW Undergraduate Research Symposium
Camila Alvarez and Petek Mertan presenting at the Allen School Undergraduate and BS/MS Research Showcase

At the UW-wide Undergraduate Research Symposium on May 19, Theo Gregersen (mentored by Prof. Franzi Roesner) presented his undergraduate thesis work on “Software-level Enforcement of Privacy Policies”. Chongjiu Gao, Sergio Medina, and their collaborators from the School of Art+Design (co-mentored by Prof. Roesner in CSE and by Prof. James Pierce in Art+Design) presented their work on “Arca, a Smart Home Camera for Your Entire Household: Designing, Prototyping, and Evaluating an Inclusive Security Camera that Improves Privacy”.

Then, at the first-ever Allen School Undergraduate and BS/MS Research Showcase on May 30, Camila Alvarez and Petek Mertan (also mentored by Prof. Roesner) presented their work on “A Visual Approach: Uncovering Mental Models of Security Threats Through Drawings”. Chongjiu, Sergio, and their collaborators presented again as well, and were recognized as runners-up for the People’s Choice Poster Award!

Congratulations to all of these budding researchers!

2023 NSF Fellowship Recipients

Congratulations to two Security Lab PhD students for being selected as National Science Foundation Graduate Research Fellowship Program (GRFP) recipients!! Cross-posting the details from Allen School News, written by Jennifer Webster:

Portrait of Rachel Hong against a blurred backdrop of a brick building, smiling and wearing a white button up cotton blouse and a navy suit jacket.

Rachel Hong

Fellowship recipient Rachel Hong is a first-year Ph.D. student. She works with Allen School professors Jamie Morgenstern, who focuses on the social impacts of ML, and Tadayoshi (Yoshi) Kohno, co-director of the Security and Privacy Research Lab.

Combining ML, security and technology policy, Hong explores the behavior of existing ML algorithms in relation to privacy and fairness, as well as how to prevent those algorithms from being misapplied in society. As an undergraduate student, Hong was introduced to the field of algorithmic fairness through building a novel representation learning algorithm on biomedical data to help patients receiving care at a variety of hospitals in both rural and urban settings. Hong seeks to build on that foundation to improve algorithmic fairness through examining demographic biases in facial recognition technology to better understand how various modifications of training data can mitigate disparate outcomes.

Portrait of Alexandra Michael against a blurred blue sky, smiling and wearing oval wire-rimmed glasses, small drop earrings with butterflies and a navy top.

Alexandra Michael

First-year Ph.D. student Alexandra Michael received a fellowship for her work that is co-advised by Allen School professors David Kohlbrenner in the Security and Privacy Research Lab and Dan Grossman in the Programming Languages and Software Engineering (PLSE) group.

Michael’s research combines her interests in security, programming languages and compilers. Prior to graduate school, Michael was fascinated by how computers could connect people yet put them at risk. Her work focuses on mitigating those risks by leveraging programming languages and security tools to improve the security and privacy of systems and the people who use them. She proposes to build a highly performant, secure and portable low-level language that will act as a target for programs written in unsafe languages.

Tina Yeung at the WebConf 2023

Tina Yeung presenting “Online Advertising in Ukraine and Russia During the 2022 Russian Invasion” at the ACM WebConf 2023

Congratulations to Tina Yeung for a great talk at the ACM Web Conference 2023 in Austin, Texas! Tina presented the paper “Online Advertising in Ukraine and Russia During the 2022 Russian Invasion”, which you can read more about here. And even more congratulations to Tina and her co-authors for having the paper selected as a “Spotlight Paper” for the conference, which means it was nominated for the Best Paper Award and received a longer (20-minute) presentation slot.

Test of Time Award for NSDI 2012 work on web tracking

(Cross-posted from Allen School News, by Kristin Osborne)

Tadayoshi Kohno and Franziska Roesner smiling and standing side by side, hands clasped in front of them, against a wall painted with visible brush strokes in shades of blue, both wearing lanyards with NSDI name tags around their necks. Kohno is wearing a grey zip-up sweatshirt over a purple t-shirt, and Roesner is wearing a blue floral-patterned blouse with the sleeves rolled up and a smartwatch with a blue wristband.
Tadayoshi Kohno (left) and Franziska Roesner at NSDI 2023. Photo by Liz Markel, courtesy of USENIX

There was a time when cookies were considered something to be savored — back when chips referred to chocolate rather than silicon. Once “cookies” became synonymous with online tracking, privacy researchers weren’t so sweet on the concept. 

That includes Allen School professors Franziska Roesner and Tadayoshi Kohno, who investigated the online tracking ecosystem for their 2012 paper “Detecting and Defending Against Third-Party Tracking on the Web.” Last month, Roesner, Kohno and co-author David Wetherall, a former Allen School professor who is now a Distinguished Engineer at Google, received the Test of Time Award at the 20th USENIX Symposium on Networked Systems Design and Implementation (NSDI 2023) for their influential work, which offered the first comprehensive evaluation of third-party trackers and their intrusion into people’s activities online. 

The team’s findings informed the nascent policy debate around web privacy that has become all the more relevant with the proliferation of social media and reliance on targeted advertising as a revenue model. They also led to the creation of new tools like Privacy Badger, a browser extension used by millions of people to protect themselves and their browsing histories online by learning and automatically blocking hidden third-party trackers. The work also inspired a significant body of follow-on research, including the team members’ subsequent paper at NSDI 2016 chronicling the increase in both the prevalence of online tracking and the complexity of tracker behavior over time.

“Considering how much time we spend online and the variety of activities we engage in, this type of tracking can yield a lot of information about a person,” said Roesner, a co-director of the Security and Privacy Research Lab at the University of Washington along with Kohno. “That’s even truer today than it was a decade ago, and I’m gratified that our work helped initiate such an important conversation and informed efforts to educate and empower users.”

At the time of the original paper’s release, third-party tracking had started to gain attention in security and privacy circles. But researchers were just nibbling around the edges, for the most part; they had a fragmented understanding of how such trackers worked and their impact on people’s online experience. Roesner — an Allen School Ph.D. student at the time — worked with Kohno and Wetherall to develop a client-side method for detecting and classifying trackers according to how they interact with the browser. They analyzed tracker prevalence and behavior on the top 500 website domains, as identified by the now-defunct web traffic analysis firm Alexa Internet, examining more than 2,000 unique pages.

“We identified 524 unique trackers, some of which had sufficient penetration across popular websites to enable them to capture a significant fraction of a user’s browsing activity — typically around 20%, and in one case, as much as 66%,” Roesner recalled.

Roesner and her colleagues cataloged five types of tracker behavior, varying from the relatively benign, to the opportunistic, to the infuriating: analytics trackers, which are generally confined to a specific site, Google Analytics being an example; “vanilla” trackers, which rely on third-party storage to track users across sites for the purposes of additional analytics or targeted advertising, such as Doubleclick; forced trackers, which include the dreaded popup or redirect that compels the user to visit their domain; referred trackers, which rely on unique identifiers leaked by other trackers; and personal trackers, which engage in cross-site tracking based on a user’s voluntary visit to their domain in other contexts. Some trackers exhibit a combination of these behaviors.

Despite the existence of multiple tools intended to give users more control, from third-party cookie blockers to “private” browsing mode, the team found those options insufficient for preventing certain trackers from following people across the web while maintaining any semblance of functionality. This was particularly true for popular social widgets by the likes of Facebook, Twitter, LinkedIn, Digg, and others that were embedded on a growing number of sites ranging from news outlets to online storefronts.

Portrait of David Wetherall against a dark building interior, smiling and wearing wireframe glasses and a black zip-up top over a lavender collared shirt.
David Wetherall

“While users could prevent some tracking, that was not the case for social widgets,” noted Roesner. “If a user was logged into a social media site like Facebook, for instance, their activity elsewhere on the web would be tracked — non-anonymously, I would add — even if they didn’t interact with the ‘like’ button embedded on those sites.”

For those who would prefer to cover their tracks while continuing to enjoy the convenience of interacting with social widgets on their terms, Roesner and her collaborators developed ShareMeNot. The browser extension took a bite out of social widgets’ ability to construct browsing profiles of users by only allowing activation of third-party tracking cookies when a user explicitly interacted with the “like,” “share,” or other relevant buttons; if a user visited a site but did not click on the social widgets, ShareMeNot stripped the cookies from any third-party requests to those trackers.
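The core idea behind the original ShareMeNot, as described above, can be sketched in a few lines: on a third-party request to a known social-widget tracker, drop the cookies unless the user has explicitly clicked the widget. This is a hypothetical illustration, not ShareMeNot's actual code, and the domain list and function names are invented for the example.

```python
# Illustrative sketch of ShareMeNot's original cookie-stripping idea
# (hypothetical code; the real tool was a browser extension).

# Illustrative set of social-widget tracker domains.
TRACKER_DOMAINS = {"facebook.com", "twitter.com", "linkedin.com", "digg.com"}

def filter_request_headers(headers, request_domain, page_domain, user_clicked_widget):
    """Return the headers to send with a request.

    Strips the Cookie header when the request is (1) third-party,
    (2) destined for a known social-widget tracker, and (3) not the
    result of an explicit user click on the widget.
    """
    is_third_party = request_domain != page_domain
    is_tracker = request_domain in TRACKER_DOMAINS
    if is_third_party and is_tracker and not user_clicked_widget:
        return {k: v for k, v in headers.items() if k.lower() != "cookie"}
    return headers

# A passive widget load on a news site: cookies are stripped.
passive = filter_request_headers(
    {"Cookie": "id=123", "Accept": "*/*"},
    request_domain="facebook.com", page_domain="news.example",
    user_clicked_widget=False,
)

# The user explicitly clicked the "like" button: cookies pass through.
clicked = filter_request_headers(
    {"Cookie": "id=123", "Accept": "*/*"},
    request_domain="facebook.com", page_domain="news.example",
    user_clicked_widget=True,
)
```

As the article notes, trackers could circumvent a cookie-only defense, which is why the refined version of the tool (described next) replaced the widget buttons themselves with local stand-ins.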

The team worked with an undergraduate research assistant in the lab, Chris Rovillos (B.S., ‘14) to refine ShareMeNot following the paper’s initial publication and address instances of the trackers attempting to circumvent the restrictions on cookies via other means. Instead of just blocking cookies, the new and improved version of the tool blocked tracker buttons altogether. In their place, ShareMeNot inserted local, stand-in versions of the buttons that users could click to either “like” a page directly or load the real button — putting users, not the trackers, in control. Roesner partnered with the nonprofit Electronic Frontier Foundation to incorporate ShareMeNot into the previously mentioned Privacy Badger, which remains an important tool for protecting users from intrusion by third-party trackers to this day.

The team’s work is notable for inspiring not only new technologies but also a new wave of researchers to focus on web tracking. One of those researchers, Umar Iqbal, followed that inspiration all the way to the Allen School.

“This is one of the seminal works in the space of web privacy and security. It had an immense influence on the community, including my own research,” observed Iqbal, a postdoc in the Security and Privacy Research Lab. “I extended several of the techniques proposed in the paper as part of my own doctoral thesis, from the measurement of online trackers, to their characterization, to building defenses. It was, in fact, one of the reasons I decided to pursue a postdoc with Franzi at UW!”

Roesner, Kohno and Wetherall were formally recognized at NSDI 2023 last month in Boston, Massachusetts. Read the research paper here. Read USENIX’s story here.
