The UW Security and Privacy Research Lab is excited to welcome two incoming PhD students, who will join us in the fall: Kaiming Cheng and Kentrell Owens. Kaiming will join us from the University of Virginia, where he has been working with Yuan Tian. Kentrell will join us from Carnegie Mellon University, where he has been working with Lorrie Cranor. Welcome, Kaiming and Kentrell!! We are so excited to have you join us, and we sincerely hope that you’ll be able to be in Seattle with us in person in the fall.
Kimberly Ruth, a senior graduating from the University of Washington this spring with bachelor’s degrees in computer engineering and mathematics, has been awarded the College of Engineering’s Dean’s Medal for Academic Excellence. Each year, the college recognizes two graduating students for academic excellence; Ruth’s combination of exemplary grades, rigorous coursework, hands-on research experience, and leadership on and off campus illustrates why she was chosen for the honor.
“We have a very strong program and many of our students are remarkable, but Kimberly stands out even from this select group,” said Allen School director and professor Magdalena Balazinska. “Her drive, leadership, undergraduate research and academic excellence are admirable, and she has only reached the beginning of her potential.”
As a freshman in the Allen School, Ruth set her sights on research right away. During her first quarter on campus, she reached out to professors Tadayoshi Kohno and Franziska Roesner, co-directors of the Security and Privacy Research Lab. Although she had only just arrived on campus, Kohno and Roesner decided to interview her for a position as an undergraduate researcher.
“Though we met with several other promising undergraduates that day, we knew before our meeting with Kimberly even finished that she stood out far above the rest,” recalled Kohno. “She has now been working with us since January of 2016, and her work in the past four and a half years has only strengthened that initial impression.”
Ruth’s research focuses on security and privacy for augmented reality (AR) platforms. These emerging technologies, such as Microsoft’s HoloLens, generate visual and audio feedback to change a person’s perception of the real world, and in doing so raise new privacy and security risks for users. While working in the Security and Privacy Research Lab, Ruth played a critical role in several research projects. In one project, she worked with Ph.D. student Kiron Lebeck to design an AR operating system that can protect against malicious or buggy output from applications. Ruth was second author on the resulting paper, “Arya: Operating System Support for Securely Augmenting Reality,” which appeared at the 38th IEEE Symposium on Security and Privacy and was published in IEEE Security and Privacy magazine in 2017. She followed that up by co-authoring “Securing Augmented Reality Output” and “Towards Security and Privacy for Multi-user Augmented Reality: Foundations with End Users” the following year.
But that wasn’t quite enough for Ruth, who has made the most of her undergraduate research experience. In June of 2017, she also began leading her own project in AR security, focusing on security for multiuser AR applications like the popular game Pokémon Go. The result was ShareAR, a toolkit that helps app developers build in collaborative and interactive features without sacrificing user privacy and security. Ruth and the team published their paper, “Secure Multi-User Content Sharing for Augmented Reality Applications,” last year at the 28th USENIX Security Symposium, where she presented the results.
Ruth, presenting her research at the 28th USENIX Security Symposium
“Kimberly’s work on this project was incredible. She independently raised, explored, prioritized, and answered a range of sophisticated research questions,” said Roesner. “She worked through design questions and implementation subtleties that were not only technically but also intellectually challenging—requiring thoughtful framing of the problem space and inventing new approaches.”
Outside of the lab, Ruth is also an adept teacher, helping her fellow students to succeed as a peer tutor for the Allen School’s Foundations in Computing course last year and inspiring the next generation through Go Figure, an initiative she founded to ignite middle school students’ interest in math.
“Kimberly is wholly deserving of all of the honors she has received, and I feel so privileged to have had the opportunity to work with her in this early stage of her career,” said Roesner. “I look forward to seeing all of the great things she will do in the future, whether in computer security research or otherwise.”
Some apps highlight when a person is online — and then share that information with their followers. When a user logs in to a website or app that uses online status indicators, a little green (or orange or blue) dot pops up to alert their followers that they’re currently online.
Researchers at the University of Washington wanted to know if people recognize that they are sharing this information and whether these indicators change how people behave online.
After surveying smartphone users, the team found that many people misunderstand online status indicators but still carefully shape their behavior to control how they are displayed to others. More than half of the participants reported that they had suspected that someone had noticed their status. Meanwhile, over half reported logging on to an app just to check someone else’s status. And 43% of participants discussed changing their settings or behavior because they were trying to avoid one specific person.
These results will be published in the Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems.
“Online status indicators are an unusual mechanism for broadcasting information about yourself to other people,” said senior author Alexis Hiniker, an assistant professor in the UW Information School. “When people share information by posting or liking something, the user is in control of that broadcast. But online status indicators are sharing information without taking explicit direction from the user. We believe our results are especially intriguing in light of the coronavirus pandemic: With people’s social lives completely online, what is the role of online status indicators?”
People need to be aware of everything they are sharing about themselves online, the researchers said.
“Practicing good online security and privacy hygiene isn’t just a matter of protecting yourself from skilled technical adversaries,” said lead author Camille Cobb, a postdoctoral researcher at Carnegie Mellon University who completed this research as a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. “It also includes thinking about how your online presence allows you to craft the identities that you want and manage your interpersonal relationships. There are tools to protect you from malware, but you can’t really download something to protect you from your in-laws.”
The team recruited 200 participants ages 19 to 64 through Amazon Mechanical Turk to fill out an online survey. Over 90% of the participants were from the U.S., and almost half of them had completed a bachelor’s degree.
The researchers asked participants to identify apps that they use from a list of 44 that have online status indicators. The team then asked participants if those apps broadcast their online status to their network. Almost 90% of participants correctly identified that at least one of the apps they used had online status indicators. But for at least one app they used, 62.5% answered “not sure” and 35.5% answered “no.” For example, of the 60 people who said they use Google Docs regularly, 40% said it didn’t have online status indicators and 28% were not sure.
Then the researchers asked the participants to time themselves while they located the settings to turn off “appearing online” in each app they used regularly. For the apps that have settings, participants gave up before they found the settings 28% of the time. For apps that don’t have these settings, such as WhatsApp, participants mistakenly thought they had turned the settings off 23% of the time.
“When you put some of these pieces together, you’re seeing that more than a third of the time, people think they’re not broadcasting information that they actually are,” Cobb said. “And then even when they’re told: ‘Please go try and turn this off,’ they’re still not able to find it more than a quarter of the time. Just broadly we’re seeing that people don’t have a lot of control over whether they share this information with their network.”
Here’s one way the team says designers could help people have more control over whether to broadcast their online status. Cobb et al./Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems
Finally the team asked participants a series of questions about their own experiences online. These questions touched on whether participants noticed when others were online, if they thought others noticed when they were online and whether they had changed their own behavior because they did or didn’t want to appear online.
“We see this repeated pattern of people adjusting themselves to meet the demands of technology — as opposed to technology adapting to us and meeting our needs,” said co-author Lucy Simko, a UW doctoral student in the Allen School. “That means people are choosing to go online not because they want to do something there but because it’s important that their status indicator is projecting the right thing at the right time.”
Now that most states have put stay-at-home orders in place to try to combat the coronavirus pandemic, many people are working from home and socializing only online. This could change how people use online status indicators, the team says. For example, employees can use their online status to indicate that they are working and available for meetings. Or people can use a family member’s “available” status as an opportunity to check up on them and make sure they are OK.
“Right now, when a lot of people are working remotely, I think there’s an opportunity to think about how future evolutions of this technology can help create a sense of community,” Cobb said. “For example, in the real world, you can have your door cracked open and that means ‘interrupt me if you have to,’ you can have it wide open to say ‘come on in’ or you can have your door closed and you theoretically won’t get disturbed. That kind of nuance is not really available in online status indicators. But we need to have a sense of balance — to create community in a way that doesn’t compromise people’s privacy, share people’s statuses when they don’t want to or allow their statuses to be abused.”
Tadayoshi Kohno, a professor in the Allen School, is also a co-author on this paper. This research was funded by the UW Tech Policy Lab.
“If you build it, they will come” might hold true for a baseball field in rural Iowa — in the days before social distancing, that is — but what about when it comes to building mobile technologies to fight a global pandemic?
In the balance between individual civil liberties and the common good, there is an obvious tension between the urge to deploy the latest, greatest tools for tracking the spread of COVID-19 and the preservation of personal privacy. But according to a team of researchers and technologists affiliated with the Paul G. Allen School of Computer Science & Engineering, UW Medicine and Microsoft, there is a way to build technology that respects the individual and their civil liberties while supporting public health objectives and saving people’s lives.
In a white paper released yesterday, the team proposes a comprehensive set of principles to guide the development of mobile tools for contact tracing and population-level disease tracking while mitigating security and privacy risks. The researchers refer to these principles as PACT, short for “Privacy Sensitive Protocols and Mechanisms for Mobile Contact Tracing.”
“Contact tracing is one of the most effective tools that public health officials have to halt a pandemic and prevent future breakouts,” explained professor Sham Kakade, who holds a joint appointment in the Allen School and the UW Department of Statistics. “The protocols in PACT are specified in a transparent manner so the tradeoffs can be scrutinized by academia, industry, and civil liberties organizations. PACT permits a more frank evaluation of the underlying privacy, security, and re-identification issues, rather than sweeping these issues under the rug.”
If people were not familiar with the concept of contact tracing before, they surely are now with the outbreak of COVID-19. Public health officials have been relying heavily on the process to identify individuals who may have been exposed through proximity to an infected person, in an effort to halt further spread of the disease. Several governments and organizations have deployed technology to assist with their response; depending on the situation, participation may be voluntary or involuntary. Whether optional or not, the increased use of technology to monitor citizens’ movements and identify the people they meet has rightly sparked concerns about mass surveillance and a loss of personal privacy.
The cornerstone of the PACT framework put forward by the UW researchers is a third-party free approach, which Kakade and his colleagues argue is preferable to a “trusted third party” (TTP) model such as that used for apps administered by government agencies. Under PACT, strict user privacy and anonymity standards stem from a decentralized approach to data storage and collection. The typical TTP model, on the other hand, involves a centralized registration process wherein users subscribe to a service. While this can be a straightforward approach and is one that will be very familiar to users, it also centrally aggregates personally sensitive information that could potentially be accessed by malicious actors. This aggregation also grants the party in question — in this case, a government agency — the ability to identify individual users and to engage in mass surveillance.
The team’s white paper lays out in detail how mobile technologies combined with a third-party free approach can be used to improve the speed, accuracy, and outcomes of contact tracing while mitigating privacy concerns and preserving civil liberties. These include the outline of an app for conducting “privacy-sensitive” mobile contact tracing that relies on Bluetooth-based proximity detection to identify instances of co-location — that is, instances of two phones in proximity, identified via their pseudonyms — to determine who may be at risk. The team prefers co-location to absolute location information because Bluetooth proximity detection is more accurate than current GPS localization technologies, such as those in popular mapping and navigation apps, while affording more robust privacy protections to the user. Depending on the nature of the specific app, such a system could allow people who test positive for the disease to securely broadcast that information under a pseudonym to other app users who were in close proximity to them, without revealing their identity or that of the recipients.
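The pseudonym-based co-location idea can be sketched in a few lines of Python. This is an illustrative toy, not the published PACT specification: the seed size, rotation epochs, and hash-based derivation below are assumptions chosen for demonstration, and a real deployment would use the protocol as specified in the white paper.

```python
# Toy sketch of decentralized, pseudonym-based co-location detection,
# in the spirit of (but not identical to) the PACT protocols.
import hashlib
import os


def pseudonym(seed: bytes, epoch: int) -> bytes:
    """Derive the pseudonym broadcast over Bluetooth during one time epoch.
    Pseudonyms rotate each epoch, so observers cannot link them without the seed."""
    return hashlib.sha256(seed + epoch.to_bytes(8, "big")).digest()[:16]


class Phone:
    def __init__(self):
        # Each phone keeps a private random seed; nothing is registered centrally.
        self.seed = os.urandom(32)
        # Pseudonyms heard from nearby phones stay on the device.
        self.heard = set()

    def broadcast(self, epoch: int) -> bytes:
        return pseudonym(self.seed, epoch)

    def observe(self, other_pseudonym: bytes) -> None:
        self.heard.add(other_pseudonym)

    def check_exposure(self, reported_seeds, epochs) -> bool:
        """If a user tests positive, they may voluntarily publish their seed;
        every other phone re-derives its pseudonyms locally and checks for a match,
        so no central party learns who was near whom."""
        return any(
            pseudonym(seed, epoch) in self.heard
            for seed in reported_seeds
            for epoch in epochs
        )


# Two phones in proximity during epoch 100: Bob's phone records Alice's pseudonym.
alice, bob = Phone(), Phone()
bob.observe(alice.broadcast(100))

# Alice tests positive and voluntarily uploads her seed; Bob finds a match locally.
assert bob.check_exposure([alice.seed], range(90, 110))
# A phone that never heard Alice's broadcasts finds no match.
assert not Phone().check_exposure([alice.seed], range(90, 110))
```

Because matching happens on each user’s own device, the server only ever sees the seeds that infected users choose to publish, never the contact graph itself.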
Another example of how PACT can aid in the pandemic response is mobile-assisted contact tracing interviews. In this scenario, a person who tests positive completes a form on their smartphone listing their contacts in advance of the interview; the data remains on the person’s device until they choose to share it with public health officials. The team also describes a system for enabling narrowcast messages, which are public service messages pushed out from a government agency to a subset of the citizenry. Such communications might be used to inform people living in a specific area of local facility closures due to an outbreak, or to notify them in the event that they were at a location during the same time frame as a person who subsequently tested positive for the disease.
Illustration of the PACT tracing protocol. M Eifler
In all cases, the researchers advocate for retaining data locally on the person’s device until they initiate a transfer.
“Only with appropriate disclosures and voluntary action on the part of the user should their data be uploaded to external servers or shared with others — and even then, only in an anonymized fashion,” explained Allen School professor Shyam Gollakota. “We consider it a best practice to have complete transparency around how and where such data is used, as well as full disclosure of the risks of re-identification from previously anonymized information once it is shared.”
Gollakota and his colleagues emphasize that technology-enabled contact tracing can only augment — not entirely replace — conventional contact tracing. In fact, two out of the three applications they describe are designed to support the latter and were developed with input from public health organizations and from co-author Dr. Jacob Sunshine of UW Medicine. There is also the simple fact that, despite their seeming ubiquity, not everyone has a smartphone; of those who do, not everyone would opt to install and use a contact-tracing app.
As Allen School professor and cryptography expert Stefano Tessaro notes, all contact tracing — whether conventional or augmented with technology — involves tradeoffs between privacy and the public good.
“Contact tracing already requires a person to give up some measure of personal privacy, as well as the privacy of those they came into contact with,” Tessaro pointed out. “However, we can make acceptable tradeoffs to enable us to use the best tools available to speed up and improve that process, while ensuring at the same time meaningful privacy guarantees, as long as the people creating and implementing those tools adhere to the PACT.”