Professor Franziska Roesner honored for outstanding engagement with undergraduate researchers

(Cross-posted from Allen School News.)

Allen School professor Franziska Roesner has earned an Undergraduate Research Mentor Award from the University of Washington. This honor recognizes her commitment to guiding undergraduate researchers to achieve success as research scholars. Students presenting their work at the annual Undergraduate Research Symposium were invited to nominate their mentors for this award, and a committee selected the honorees. This year, five out of 188 nominated mentors were chosen.

Roesner, co-director of the Security and Privacy Research Lab, mentors eight undergraduate researchers on her team. Savanna Yee, a fifth-year undergraduate in the lab, said Roesner’s affable personality made working in the lab less intimidating.

“Franzi is wonderful to work with. She’s very approachable, and really cares about prioritizing the goals of the undergrad students and makes sure to check in with us frequently,” Yee said. “When I first started working with Franzi I didn’t expect to have so much direct contact with a faculty member, but I am so glad that she makes time to check in with us and really get to know us as individuals. Franzi is honest, and open about her imperfections and struggles, and I really appreciate this because sometimes, when working with an expert leader in a field, we hold them up on a pedestal. But Franzi is so real about being a regular person, and this makes me very comfortable.”

Roesner attributes her passion for undergraduate research mentorship to her own early exposure to it at the University of Texas at Austin, from her professor at the time, Doug Burger.

“The only reason that my own career even followed this path is because I had an amazing undergraduate research mentor, so I am trying to pay it forward,” she said.

Kimberly Ruth, who is also a fifth-year senior in the Security and Privacy Research Lab, said Roesner’s support is inspiring.

“Franzi is an extraordinarily supportive mentor. She empowers me to be a meaningful contributor in project planning and implementation, giving me ample room to grow and contribute. Her communication is always clear, prompt, and friendly,” Ruth said. “Even amidst a busy faculty schedule, she always takes time to comment thoughtfully on works in progress: anything from a brainstormed list of ideas to a section of an academic paper in preparation to a research scholarship application essay. With her guidance and feedback, I’ve taken on increasing levels of autonomy and responsibility in my work, becoming increasingly self-sufficient and skilled as a young researcher. She’s given helpful advice at career decision points I’ve faced, sharing anecdotes that advise and reassure. I feel incredibly lucky to have Franzi as my mentor.”

Roesner, whose research spans a number of projects related to privacy and security in emerging technologies, said that developing research proficiency as an undergraduate is invaluable.  

“I think the skills you learn in doing research are valuable beyond that specific field, or even a research-focused career path,” Roesner said. “You learn how to identify important problems, how to make concrete progress in the face of vast uncertainty about where to even begin or how to evaluate success, how to pick up new skills and knowledge as needed to solve your problem, how to collaborate and ask questions, how to grow from failure, and so on.”

Provost Mark Richards and Dean and Vice Provost for Undergraduate Academic Affairs Ed Taylor recognized the awardees in a recorded video message today before this year’s virtual symposium.

Congratulations, Franzi — and thank you for being an extraordinary mentor to our students!

Congratulations and Welcome, Dr. Emami-Naeini!

Congratulations to incoming Security Lab postdoc Pardis Emami-Naeini for successfully defending her dissertation at CMU this week! Pardis’s PhD work at CMU has focused on tools and methods to better inform people’s privacy and security decision-making in the Internet of Things (IoT), advised by Lorrie Cranor and Yuvraj Agarwal. Pardis will be joining the UW Security and Privacy Research Lab in September. We look forward to welcoming you to Seattle (hopefully soon in person), Pardis!

Savanna Yee Named One of Husky 100

Security Lab undergraduate researcher Savanna Yee has been named to the Husky 100, a program that recognizes students from across the University of Washington’s three campuses who are making the most of their Husky Experience.

Quoting from Allen School News:

Savanna Yee is a computer science and informatics major with a focus on human-computer interaction and is in the interdisciplinary honors program. She is in her fifth year as an undergraduate and is starting the B.S./M.S. program. She has had four internships, with two more coming up, and has worked as a TA and as a researcher in the Security and Privacy Lab for more than a year.

While serving as a mentor/tutor on the Pipeline Project, Yee learned to be a more empathetic leader. After a tragic loss during her junior year, she used what she learned from working through her pain to help others. Reflecting on her own vulnerability, Yee reached out to the Allen School community to encourage everyone to be more open about their own struggles. She created a panel discussion where students, staff and faculty of the Allen School could talk about their failures and vulnerabilities and how they overcame the obstacles. She also joined Unite UW, an organization helping to build a bridge between domestic and international students. She volunteers as a peer advisor in the Allen School, serves on the student advisory council and was an officer last year for the UW Association for Computing Machinery for Women.

“Mentor, maker, teacher, performer, advisor, advocate, researcher, event organizer. I’ve constantly lost and found myself here, uncertainty is something I’ve learned not to fear,” Yee said poetically. “Here I’ve gained new perspectives, been inspired by brilliance, opened up about depression, healing, resilience. U-Dub has fueled my interdisciplinary mind, always enticing me with more connections to find. By combining technology, ethics, wellbeing, and art, this is how I’ll empower people–or at least how I’ll start.”

Congratulations, Savanna!

Welcome Incoming PhD Students!

The UW Security and Privacy Research Lab is excited to welcome two incoming PhD students, who will join us in the fall: Kaiming Cheng and Kentrell Owens. Kaiming will join us from the University of Virginia, where he has been working with Yuan Tian. Kentrell will join us from Carnegie Mellon University, where he has been working with Lorrie Cranor. Welcome, Kaiming and Kentrell!! We are so excited to have you join us, and we sincerely hope that you’ll be able to be in Seattle with us in person in the fall.

Kaiming Cheng
Kentrell Owens

Security Lab senior Kimberly Ruth awarded College of Engineering Dean’s Medal

(Cross-posted from Allen School News.)

Kimberly Ruth, a senior graduating from the University of Washington this spring with bachelor’s degrees in computer engineering and mathematics, has been awarded the College of Engineering’s Dean’s Medal for Academic Excellence. Each year, the college recognizes two graduating students for academic excellence; Ruth’s combination of exemplary grades, rigorous coursework, hands-on research experience, and leadership on campus and off illustrates why she was chosen for the honor.

“We have a very strong program and many of our students are remarkable, but Kimberly stands out even from this select group,” said Allen School director and professor Magdalena Balazinska. “Her drive, leadership, undergraduate research and academic excellence are admirable, and she has only reached the beginning of her potential.”

As a freshman in the Allen School, Ruth set her sights on research right away. During her first quarter on campus, she reached out to professors Tadayoshi Kohno and Franziska Roesner, co-directors of the Security and Privacy Research Lab. Although she had not been on campus very long, Kohno and Roesner decided to interview her for a position as an undergraduate researcher anyway.

“Though we met with several other promising undergraduates that day, we knew before our meeting with Kimberly even finished that she stood out far above the rest,” recalled Kohno. “She has now been working with us since January of 2016, and her work in the past four and a half years has only strengthened that initial impression.”

Ruth’s research focuses on security and privacy for augmented reality (AR) platforms. These emerging technologies, such as Microsoft’s HoloLens, generate visual and audio feedback to change a person’s perception of the real world. They also raise new privacy and security risks for users. While working in the Security and Privacy Research Lab, Ruth played a critical role in several research projects. In one project, Ruth worked with Ph.D. student Kiron Lebeck to design an AR operating system that can protect against malicious or buggy output from applications. Ruth was second author on the resulting paper, “Securing Augmented Reality Output,” which appeared at the 38th IEEE Symposium on Security and Privacy in 2017. She followed that up by co-authoring “Arya: Operating System Support for Securely Augmenting Reality,” published in IEEE Security and Privacy magazine, and “Towards Security and Privacy for Multi-user Augmented Reality: Foundations with End Users,” which appeared at the symposium the following year.

But that wasn’t quite enough for Ruth, who has made the most of her undergraduate research experience. In June of 2017, she also began leading her own project in AR security, focusing on security for multiuser AR applications like the popular game Pokémon Go. The result was ShareAR, a toolkit that helps app developers build in collaborative and interactive features without sacrificing user privacy and security. Ruth and the team published their paper, “Secure Multi-User Content Sharing for Augmented Reality Applications,” last year at the 28th USENIX Security Symposium, where she presented the results.

Ruth, presenting her research at the 28th USENIX Security Symposium

“Kimberly’s work on this project was incredible. She independently raised, explored, prioritized, and answered a range of sophisticated research questions,” said Roesner. “She worked through design questions and implementation subtleties that were not only technically but also intellectually challenging—requiring thoughtful framing of the problem space and inventing new approaches.”

Outside of the lab, Ruth is also an adept teacher, helping her fellow students to succeed as a peer tutor for the Allen School’s Foundations in Computing course last year and inspiring the next generation through Go Figure, an initiative she founded to ignite middle school students’ interest in math.

“Kimberly is wholly deserving of all of the honors she has received, and I feel so privileged to have had the opportunity to work with her in this early stage of her career,” said Roesner. “I look forward to seeing all of the great things she will do in the future, whether in computer security research or otherwise.”

In addition to being a Dean’s Medalist, Ruth previously earned the Lisa Simonyi Prize, a 2018 Goldwater Scholarship (Kimberly’s brother Parker, also an extraordinary Allen School senior, received a Goldwater in 2020), finalist standing in the Computing Research Association’s Undergraduate Researcher Award competition in both 2018 and 2019, Washington Research Foundation Fellowships for 2017, 2018 and 2019, and most recently a 2020 National Science Foundation Graduate Research Fellowship. In 2018 she was recognized as a member of the Husky 100, which celebrates UW students who are making the most of their “Husky Experience.” This fall she’ll be pursuing her Ph.D. at Stanford, focusing on computer security and privacy.

Congratulations, Kimberly, and thank you for your commitment to excellence inside and outside of the Allen School!

‘I saw you were online’: How online status indicators shape our behavior

(Cross-posted from UW News, by Sarah McQuate)

Some apps highlight when a person is online — and then share that information with their followers. When a user logs in to a website or app that uses online status indicators, a little green (or orange or blue) dot pops up to alert their followers that they’re currently online.

Researchers at the University of Washington wanted to know if people recognize that they are sharing this information and whether these indicators change how people behave online.

A graphic showing the online status of four people: Alice, who is online at work; Bob, who is offline; Carol, who is online but has changed her status to appear offline to avoid Malory; and Malory, who is waiting for Carol to get online to ask for a favor.

UW researchers found that many people misunderstand online status indicators but still carefully shape their behavior to control how they are displayed to others. Camille Cobb

After surveying smartphone users, the team found that many people misunderstand online status indicators but still carefully shape their behavior to control how they are displayed to others. More than half of the participants reported that they had suspected that someone had noticed their status. Meanwhile, over half reported logging on to an app just to check someone else’s status. And 43% of participants discussed changing their settings or behavior because they were trying to avoid one specific person.

These results will be published in the Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems.

“Online status indicators are an unusual mechanism for broadcasting information about yourself to other people,” said senior author Alexis Hiniker, an assistant professor in the UW Information School. “When people share information by posting or liking something, the user is in control of that broadcast. But online status indicators are sharing information without taking explicit direction from the user. We believe our results are especially intriguing in light of the coronavirus pandemic: With people’s social lives completely online, what is the role of online status indicators?”

People need to be aware of everything they are sharing about themselves online, the researchers said.

“Practicing good online security and privacy hygiene isn’t just a matter of protecting yourself from skilled technical adversaries,” said lead author Camille Cobb, a postdoctoral researcher at Carnegie Mellon University who completed this research as a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. “It also includes thinking about how your online presence allows you to craft the identities that you want and manage your interpersonal relationships. There are tools to protect you from malware, but you can’t really download something to protect you from your in-laws.”

The team recruited 200 participants ages 19 to 64 through Amazon Mechanical Turk to fill out an online survey. Over 90% of the participants were from the U.S., and almost half of them had completed a bachelor’s degree.

The researchers asked participants to identify apps that they use from a list of 44 that have online status indicators. The team then asked participants if those apps broadcast their online status to their network. Almost 90% of participants correctly identified that at least one of the apps they used had online status indicators. But for at least one app they used, 62.5% answered “not sure” and 35.5% answered “no.” For example, of the 60 people who said they use Google Docs regularly, 40% said it didn’t have online status indicators and 28% were not sure.

Then the researchers asked the participants to time themselves while they located the settings to turn off “appearing online” in each app they used regularly. For the apps that have settings, participants gave up before they found the settings 28% of the time. For apps that don’t have these settings, such as WhatsApp, participants mistakenly thought they had turned the settings off 23% of the time.

“When you put some of these pieces together, you’re seeing that more than a third of the time, people think they’re not broadcasting information that they actually are,” Cobb said. “And then even when they’re told: ‘Please go try and turn this off,’ they’re still not able to find it more than a quarter of the time. Just broadly we’re seeing that people don’t have a lot of control over whether they share this information with their network.”

A graphic of an online status indicator that gives users a countdown to see when they will appear online, and an easy-access button to change their status.

Here’s one way the team says designers could help people have more control over whether to broadcast their online status. Cobb et al./Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems

Finally the team asked participants a series of questions about their own experiences online. These questions touched on whether participants noticed when others were online, if they thought others noticed when they were online and whether they had changed their own behavior because they did or didn’t want to appear online.

“We see this repeated pattern of people adjusting themselves to meet the demands of technology — as opposed to technology adapting to us and meeting our needs,” said co-author Lucy Simko, a UW doctoral student in the Allen School. “That means people are choosing to go online not because they want to do something there but because it’s important that their status indicator is projecting the right thing at the right time.”

Now that most states have put stay-at-home orders in place to try to combat the coronavirus pandemic, many people are working from home and socializing only online. This could change how people use online status indicators, the team says. For example, employees can use their online status to indicate that they are working and available for meetings. Or people can use a family member’s “available” status as an opportunity to check up on them and make sure they are OK.

“Right now, when a lot of people are working remotely, I think there’s an opportunity to think about how future evolutions of this technology can help create a sense of community,” Cobb said. “For example, in the real world, you can have your door cracked open and that means ‘interrupt me if you have to,’ you can have it wide open to say ‘come on in’ or you can have your door closed and you theoretically won’t get disturbed. That kind of nuance is not really available in online status indicators. But we need to have a sense of balance — to create community in a way that doesn’t compromise people’s privacy, share people’s statuses when they don’t want to or allow their statuses to be abused.”

Tadayoshi Kohno, a professor in the Allen School, is also a co-author on this paper. This research was funded by the UW Tech Policy Lab.

For more information, contact Hiniker at alexisr@uw.edu, Cobb at ccobb@andrew.cmu.edu, Simko at simkol@cs.washington.edu and Kohno at yoshi@cs.washington.edu.

Privacy and the pandemic: UW and Microsoft researchers present a “PACT” for using technology to fight the spread of COVID-19

(Cross-posted from Allen School News.)

If you build it, they will come. 

That statement might hold true for a baseball field in rural Iowa — in the days before social distancing, that is — but what about when it comes to building mobile technologies to fight a global pandemic? 

In the balance between individual civil liberties and the common good, there is an obvious tension between the urge to deploy the latest, greatest tools for tracking the spread of COVID-19 and the preservation of personal privacy. But according to a team of researchers and technologists affiliated with the Paul G. Allen School of Computer Science & Engineering, UW Medicine and Microsoft, there is a way to build technology that respects the individual and their civil liberties while supporting public health objectives and saving people’s lives.

In a white paper released yesterday, the team proposes a comprehensive set of principles to guide the development of mobile tools for contact tracing and population-level disease tracking while mitigating security and privacy risks. The researchers refer to these principles as PACT, short for “Privacy Sensitive Protocols and Mechanisms for Mobile Contact Tracing.”

“Contact tracing is one of the most effective tools that public health officials have to halt a pandemic and prevent future breakouts,” explained professor Sham Kakade, who holds a joint appointment in the Allen School and the UW Department of Statistics. “The protocols in PACT are specified in a transparent manner so the tradeoffs can be scrutinized by academia, industry, and civil liberties organizations. PACT permits a more frank evaluation of the underlying privacy, security, and re-identification issues, rather than sweeping these issues under the rug.”

If people were not familiar with the concept of contact tracing before, they surely are now with the outbreak of COVID-19. Public health officials have been relying heavily on the process to identify individuals who may have been exposed through proximity to an infected person to try and halt further spread of the disease. Several governments and organizations have deployed technology to assist with their response; depending on the situation, participation may be voluntary or involuntary. Whether optional or not, the increased use of technology to monitor citizens’ movements and identify other people with whom they meet has rightly sparked concerns around mass surveillance and a loss of personal privacy.

The cornerstone of the PACT framework put forward by the UW researchers is a third-party-free approach, which Kakade and his colleagues argue is preferable to a “trusted third party” (TTP) model such as that used for apps administered by government agencies. Under PACT, strict user privacy and anonymity standards stem from a decentralized approach to data storage and collection. The typical TTP model, on the other hand, involves a centralized registration process wherein users subscribe to a service. While this can be a straightforward approach and is one that will be very familiar to users, it also centrally aggregates personally sensitive information that could potentially be accessed by malicious actors. This aggregation also grants the party in question — in this case, a government agency — the ability to identify individual users and to engage in mass surveillance.

The team’s white paper lays out in detail how mobile technologies combined with a third-party-free approach can be used to improve the speed, accuracy, and outcomes of contact tracing while mitigating privacy concerns and preserving civil liberties. This includes the outline of an app for conducting “privacy-sensitive” mobile contact tracing that relies on Bluetooth-based proximity detection to identify instances of co-location — that is, instances of two phones in proximity, via their pseudonyms — to determine who may be at risk. The team prefers co-location to absolute location information because it is more accurate than current GPS localization technologies, such as those in popular mapping and navigation apps, while affording more robust privacy protections to the user. Depending on the nature of the specific app, such a system could be useful in allowing people who test positive for the disease to securely broadcast information under a pseudonym to other app users who were in close proximity to them, without having to reveal their identity or that of the recipients.
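To make the decentralized idea concrete, here is a minimal Python sketch of pseudonym-based co-location matching. It is an illustration of the general approach only, not the protocol specified in the white paper: the epoch length, the hash-based pseudonym derivation, and all of the names (Device, derive_pseudonym, check_exposure) are invented for this example.

```python
import hashlib
import os
import time

EPOCH_SECONDS = 15 * 60  # illustrative pseudonym rotation interval


def derive_pseudonym(seed: bytes, epoch: int) -> bytes:
    """Derive the pseudonym broadcast during a given epoch from a secret seed."""
    return hashlib.sha256(seed + epoch.to_bytes(8, "big")).digest()[:16]


class Device:
    """Toy model of one phone participating in decentralized proximity tracing."""

    def __init__(self) -> None:
        self.seed = os.urandom(32)  # secret seed; never leaves the device
        self.heard = set()          # pseudonyms observed over Bluetooth (local log)

    def current_pseudonym(self) -> bytes:
        epoch = int(time.time()) // EPOCH_SECONDS
        return derive_pseudonym(self.seed, epoch)

    def observe(self, pseudonym: bytes) -> None:
        self.heard.add(pseudonym)

    def check_exposure(self, published_seed: bytes,
                       start_epoch: int, end_epoch: int) -> bool:
        """Recompute a positive user's pseudonyms locally and match them
        against the local log; no server ever sees who was near whom."""
        return any(derive_pseudonym(published_seed, e) in self.heard
                   for e in range(start_epoch, end_epoch + 1))


# Example: Bob's phone hears Alice's broadcast; Alice later tests positive
# and voluntarily publishes her seed; Bob's phone detects the co-location.
alice, bob = Device(), Device()
now_epoch = int(time.time()) // EPOCH_SECONDS
bob.observe(derive_pseudonym(alice.seed, now_epoch))
print(bob.check_exposure(alice.seed, now_epoch - 4, now_epoch + 4))  # True
```

The property the sketch tries to convey is that the secret seed and the log of observed pseudonyms stay on the phone; matching happens locally against seeds that infected users choose to publish.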

Another way PACT can aid in the pandemic response is through mobile-assisted contact tracing interviews. In this scenario, a person who tests positive completes a form on their smartphone listing their contacts in advance of the interview; the data remains on the person’s device until they choose to share it with public health officials. The team also describes a system for enabling narrowcast messages, which are public service messages pushed out from a government agency to a subset of the citizenry. Such communications might be used to inform people living in a specific area of local facility closures due to an outbreak, or to notify them in the event that they were at a location during the same time frame as a person who subsequently tested positive for the disease.
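As a rough sketch of how narrowcast messages could work under the same philosophy, a phone might download every published message and filter it locally against its own stored visit history, so that no server learns where the user has been. The region encoding, field names, and client-side filtering below are assumptions made for illustration, not the mechanism specified in the white paper.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Narrowcast:
    """A public-service message scoped to an area and a time window."""
    region: str  # e.g., a coarse named area or geohash (illustrative)
    start: int   # window start, epoch seconds
    end: int     # window end, epoch seconds
    text: str


def relevant_messages(all_messages, visits):
    """Client-side filtering: the phone checks its locally stored
    (region, timestamp) visit history against every published message,
    so the server never learns the user's location history."""
    return [m for m in all_messages
            if any(region == m.region and m.start <= t <= m.end
                   for region, t in visits)]


# Example: a closure notice reaches only devices whose local history
# includes a visit to the affected area during the affected window.
msgs = [Narrowcast("Capitol Hill", 1000, 2000, "Facility closed after outbreak")]
print(relevant_messages(msgs, [("Capitol Hill", 1500)]))
```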

Illustration of the PACT tracing protocol. M Eifler

In all cases, the researchers advocate for retaining data locally on the person’s device until they initiate a transfer.

“Only with appropriate disclosures and voluntary action on the part of the user should their data be uploaded to external servers or shared with others — and even then, only in an anonymized fashion,” explained Allen School professor Shyam Gollakota. “We consider it a best practice to have complete transparency around how and where such data is used, as well as full disclosure of the risks of re-identification from previously anonymized information once it is shared.”

Gollakota and his colleagues emphasize that technology-enabled contact tracing can only augment — not entirely replace — conventional contact tracing. In fact, two out of the three applications they describe are designed to support the latter and were developed with input from public health organizations and from co-author Dr. Jacob Sunshine of UW Medicine. There is also the simple fact that, despite their seeming ubiquity, not everyone has a smartphone; of those who do, not everyone would opt to install and use a contact-tracing app. 

As Allen School professor and cryptography expert Stefano Tessaro notes, all contact tracing — whether conventional or augmented with technology — involves tradeoffs between privacy and the public good.

“Contact tracing already requires a person to give up some measure of personal privacy, as well as the privacy of those they came into contact with,” Tessaro pointed out. “However, we can make acceptable tradeoffs to enable us to use the best tools available to speed up and improve that process, while ensuring at the same time meaningful privacy guarantees, as long as the people creating and implementing those tools adhere to the PACT.”

The team, which also includes Allen School Ph.D. students Justin Chan and Sudheesh Singanamalla, postdoctoral researcher Joseph Jaeger, and professor Tadayoshi Kohno — along with the technologists John Langford, Eric Horvitz, and Jonathan Larson at Microsoft — posted its white paper on the preprint site arXiv.org to encourage broad dissemination and conversation around this topic. Read the full paper here.

“Hey, check out this 450-pound dog!” Allen School researchers explore how users interact with bogus social media posts

(Cross-posted from Allen School News.)

Dark, swirling clouds over an aerial shot of Sydney harbor and downtown
Is that a superstorm over Sydney, or fake news?

We’ve all seen the images scrolling through our social media feeds — the improbably large pet that dwarfs the human sitting beside it; the monstrous stormcloud ominously bearing down on a city full of people; the elected official who says or does something outrageous (and outrageously out of character). We might stop mid-scroll and do a double-take, occasionally hit “like” or “share,” or dismiss the content as fake news. But how do we as consumers of information determine what is real and what is fake?

Freakishly large Fido may be fake news — sorry! — but this isn’t: A team of researchers led by professor Franziska Roesner, co-director of the Allen School’s Security and Privacy Research Laboratory, conducted a study examining how and why users investigate and act on fake content shared on their social media feeds. The project, which involved semi-structured interviews with more than two dozen users ranging in age from 18 to 74, aimed to better understand what tools would be most useful to people trying to determine which posts are trustworthy and which are bogus.

In a “think aloud” study in the lab, the researchers asked users to provide a running commentary on their reaction to various posts as they scrolled through their social feeds. Their observations provided the team with insights into the thought process that goes into a user’s decision to dismiss, share, or otherwise engage with fake content they encounter online. Unbeknownst to the participants, the researchers deployed a browser extension they had built that randomly layered misinformation posts previously debunked by Snopes.com over legitimate posts shared by participants’ Facebook friends and accounts they followed on Twitter.

The artificial posts that populated users’ feeds ranged from the sublime (the aforementioned giant dog), to the ridiculous (“A photograph shows Bernie Sanders being arrested for throwing eggs at civil rights protesters”), to the downright hilarious (“A church sign reads ‘Adultery is a sin. You can’t have your Kate and Edith too’”). As the participants scrolled through the mixture of legitimate and fake posts, Allen School Ph.D. student Christine Geeng and her colleagues would ask them why they chose to engage with or ignore various content. At the end of the experiment, the researchers pointed out the fake posts and informed participants that their friends and contacts had not really shared them. Geeng and her colleagues also noted that participants could not actually like or share the fake content on their real feeds.

“Our goal was not to trick participants or to make them feel exposed,” explained Geeng, lead author of the paper describing the study. “We wanted to normalize the difficulty of determining what’s fake and what’s not.”

Participants employed a variety of strategies in dealing with the misinformation posts as they scrolled through. Many posts were simply ignored at first sight, whether because they were political in nature, required too much time and effort to investigate, or the viewer was simply uninterested in the topic presented. If a post caught their attention, some users investigated further by looking at the name on the account that appeared to have posted it, or read through comments from others before making up their own minds. Others clicked through to the full article to check whether the claim was bogus — such as in the case of the Bernie Sanders photo, which was intentionally miscaptioned in the fake post. Participants also self-reported that, outside of a laboratory setting, they might consult a fact-checking website like Snopes.com, see if trusted news sources were reporting on the same topic, or seek out the opinions of family members or others in their social circle.

The researchers found that users were more likely to employ such ad hoc strategies over purpose-built tools provided by the platforms themselves. For example, none of the study participants used Facebook’s “i” button to investigate fake content; in fact, most said they were unaware of the button’s existence. Whether a matter of functionality or design (or both), the team’s findings suggest there is room for improvement when it comes to offering truly useful tools for people who are trying to separate fact from fiction.

“There are a lot of people who are trying to be good consumers of information and they’re struggling,” said Roesner. “If we can understand what these people are doing, we might be able to design tools that can help them.”

In addition to Roesner and Geeng, Savanna Yee, a fifth-year master’s student in the Allen School, contributed to the project. The team will present its findings at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems (CHI 2020) next month.

Learn more in the UW News release here, and read the research paper here.
