Security Lab PhD student — now officially PhD Candidate — Eric Zeng passed his General Exam on May 11, 2020. This exam, in the form of a dissertation proposal, is the second of three major milestones on the path to a PhD (between the Quals Exam and the Final Exam). Congratulations Eric!!
Ten years ago, a team of security and privacy researchers at the University of Washington and University of California, San Diego published a paper, “Experimental Security Analysis of a Modern Automobile,” describing how they were able to override critical safety systems and take control of a range of vehicle functions in what was later revealed to be a pair of 2009 Chevy Impalas. That work, which was first presented at the IEEE’s 2010 Symposium on Security and Privacy in Oakland, California, opened up an entirely new avenue of cybersecurity research while serving as a wake-up call to an industry that was more accustomed to guarding against break-ins of the physical, rather than the over-the-air, kind. This week, the IEEE Computer Society Technical Committee on Security and Privacy recalled the significance of the team’s contributions and their enduring impact with its 2020 Test of Time Award.
The project was originally the brainchild of professor Tadayoshi Kohno of the Allen School’s Security and Privacy Research Lab and one of his mentors, UCSD professor and Allen School alumnus Stefan Savage (Ph.D., ‘02). Fresh off the success of Kohno’s 2008 IEEE Symposium on S&P paper examining the security of wireless implantable medical devices — which also later earned a Test of Time Award — he and Savage turned their attention to another technology gaining in popularity: the computerized automobile.
Backed by funding from the National Science Foundation and flexible funds from an Alfred P. Sloan Fellowship, the duo pulled together what they refer to as an “all-star team” of students. The lineup included then Allen School Ph.D. students Karl Koscher, Alexei Czeskis, and Franziska Roesner; and UCSD Ph.D. student Stephen Checkoway, postdoc Damon McCoy, and master’s student Danny Anderson. Checkoway and Anderson were no strangers to UW; the former had earned his bachelor’s from the Allen School and the Department of Mathematics in 2005, while the latter had just graduated with his bachelor’s from the Allen School in 2009. Allen School and UW Department of Electrical & Computer Engineering professor Shwetak Patel and UCSD professor Hovav Shacham joined the leadership team during the formative stages of the project, and UCSD research staff member Brian Kantor rounded out the group.
The team would become the first to drive home to automobile manufacturers, regulators, security experts, and the public the extent to which modern-day vehicles were vulnerable to cyberattacks. According to Savage, one of the main reasons they succeeded in doing so was that the students — all new, or at least, new to this area of research — “didn’t know any better” and were therefore undaunted by the task set for them by their mentors.
“Essentially, we bought two cars and said to the students, here are the keys, go figure it out,” recalled Savage. “To Yoshi’s and my delight, they did. And in the process, they established this entirely new subfield of automotive security research.”
“This was an extremely collaborative effort; no task was performed by an individual researcher alone. I believe our close collaboration was the key to our success,” explained Checkoway, the lead Ph.D. student on the UCSD side who later joined the faculty of Oberlin College. “On a personal level, the large group collaboration was so much fun that collaborative research has been my preferred method of research ever since.”
It is a theme that is echoed by Checkoway’s colleagues, even 10 years on.
“It was really exciting to join this great team and contribute to such an impactful project at the very beginning of graduate school,” said Roesner (Ph.D., ‘14), now a professor in the Allen School and co-director of the Security and Privacy Research Lab with Kohno. “I had recently decided to switch my focus from computer architecture to security, after discovering that I really liked the ‘security mindset’ of challenging assumptions in designs. This experience and this paper essentially launched my security research career.”
Koscher (Ph.D., ‘14), who has since returned to his old Seattle stomping ground as a research scientist after completing a postdoc at UCSD, was the lead Ph.D. researcher on the UW side.
“We really were one team, and there was definitely enough work to keep everyone on both sides busy,” Koscher recalled. “We at UW would attack a problem from one direction, while the folks at UCSD attacked it from another. Each side brought the puzzle pieces to complete the other.”
Among the puzzles the team needed to piece together was how to access one or more of a vehicle’s electronic control units (ECUs) — the collection of independent computers that communicate across multiple internal networks. At the time, it was estimated that the average luxury sedan contained as many as 70 ECUs running over 150 megabytes of code. This did not comprise the totality of the potential attack surface of a vehicle, however; additional entry points came in the form of the federally mandated onboard diagnostic system, optional short-range wireless capabilities such as Bluetooth, and telematics such as OnStar with its long-range cellular radio link.
Beginning with 2008 models, all cars sold in the United States were required to implement the Controller Area Network (CAN) bus for diagnostics — making it the dominant in-car communication network not only for GM, but also other major manufacturers such as Ford, BMW, Honda, and Volkswagen. To facilitate the full range of exploits they wanted to explore, Koscher and the team developed CarShark, a custom CAN bus analyzer and packet injection tool.
Using this approach, the team determined that weaknesses in the underlying CAN protocol meant that, by infiltrating almost any one of the vehicle’s ECUs, an attacker would be able to leverage that access to circumvent a broad array of safety-critical systems. In a series of experiments, both in the lab and on the road, the researchers demonstrated the ability to control a variety of vehicle functions while overriding or disabling driver input. They also examined scenarios in which malicious actors could exploit multiple components in a composite attack, including using the telematics unit to bridge multiple ECUs and to inject or wipe malicious code.
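The team’s actual CarShark tool was never released, but the core mechanic it relied on — framing and injecting raw packets on the CAN bus — can be illustrated with a short sketch. The layout below follows Linux’s SocketCAN `can_frame` structure; the arbitration ID and payload bytes are invented for illustration and do not correspond to any real ECU’s message set.

```python
import struct

# SocketCAN's fixed 16-byte can_frame layout: a 32-bit arbitration ID,
# a 1-byte data length code (DLC), 3 padding bytes, and 8 data bytes.
CAN_FRAME_FMT = "<IB3x8s"

def pack_can_frame(arbitration_id: int, data: bytes) -> bytes:
    """Serialize a classic CAN frame (payload of at most 8 bytes)."""
    if len(data) > 8:
        raise ValueError("classic CAN payloads are at most 8 bytes")
    return struct.pack(CAN_FRAME_FMT, arbitration_id, len(data),
                       data.ljust(8, b"\x00"))

def unpack_can_frame(frame: bytes) -> tuple[int, bytes]:
    """Deserialize a raw frame back into (arbitration_id, data)."""
    arb_id, dlc, payload = struct.unpack(CAN_FRAME_FMT, frame)
    return arb_id, payload[:dlc]

# Hypothetical injected packet: ID 0x123 with a 2-byte payload.
frame = pack_can_frame(0x123, bytes([0x01, 0xFF]))
assert unpack_can_frame(frame) == (0x123, bytes([0x01, 0xFF]))
```

A sniffer in the CarShark mold simply reads such 16-byte frames off the bus interface and logs (ID, payload) pairs, while injection writes crafted frames back; the CAN protocol itself performs no authentication of either.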
Czeskis (Ph.D., ‘13), who is currently a Staff Software Engineer at Google focused on authentication, identity, and protection of high-risk users, recalled both the audacity and novelty of what he and his fellow students were doing — particularly when it came to testing.
“We had to verify that our hypotheses and techniques would hold outside of the lab setting,” he explained. “That meant we often had to drive the car up to the computer science building, lift it on jack stands, and then repeatedly rev the engine and honk the horn for extended periods of time while puzzled students walked by.
“We also needed to test our techniques in a safe, real-world setting, so we took our car to a decommissioned airstrip,” he continued. “That involved signing a waiver acknowledging the ‘possibility of death’ as a graduate student while working on this project! Of course, we had appropriate safety precautions in place. As a motorcycle rider with protective equipment and perhaps a higher tolerance for risk than other members of the team, I ended up being the test driver at the airstrip and other test environments.”
The results of those tests ranged from annoying to downright alarming. For example, the team found through its stationary testing that it could gain control of the radio to deliver audible clicks and chimes at arbitrary intervals. The researchers also gained full control of the Instrument Panel Cluster (IPC), including the speedometer, fuel gauge and other displays, to deliver a message to a hypothetical driver that they had been “Pwned by CarShark.” The team found additional ways to interfere with functions that could compromise driver and passenger safety through an ECU called the Body Control Module. These included locking and unlocking the doors, adjusting or disabling the interior and exterior lighting, operating the windshield wipers, and engaging in the aforementioned horn honking.
All of this sounds frightening enough in a stationary vehicle, but the team demonstrated that they could do these things while the car was moving at 40 miles per hour. They also broke into the Engine Control Module which, as the name suggests, gave them control over the vehicle’s engine. Once they gained access to the ECM, the researchers were able to temporarily boost engine RPM, and even disable the engine completely. But the researchers didn’t stop there; they also infiltrated the Electronic Brake Control Module. That enabled them to lock individual brakes or sets of brakes — a capability they later demonstrated to great effect in a CBS “60 Minutes” segment featuring correspondent Lesley Stahl behind the wheel. They could also release the brakes and then prevent them from subsequently being enabled.
The team knew they were onto something big, but it took a while to figure out who they could go to with their findings. “When we started, we didn’t even know how to get in touch with the right people — if they even existed — at the manufacturer,” Koscher recalled. “It took the industry by complete surprise.”
They eventually did find the right people at GM, opting to initially share their findings directly with the company while declining to “name and shame” it in the paper released at the IEEE Symposium on S&P and to the public.
“It was clear to us that these vulnerabilities stemmed primarily from the architecture of the modern automobile, not from design decisions made by any single manufacturer,” Kohno explained. “It later came out that our model was from GM, but it was never just about GM. Like so much that we encounter in the security field, this was an industry-wide issue that would require industry-wide solutions.”
Those solutions, which can be directly traced to the UW and UCSD collaboration, include new standards for motor vehicle security, guidelines for original equipment manufacturers (OEMs), and the creation of the Electronic Systems Safety Research Division at the National Highway Traffic Safety Administration. And the impact of the team’s work continues to be felt to this day.
“I think it was a bit unexpected how impactful this work would be,” Koscher said. “Yoshi’s previous work included exploring vulnerabilities in pacemakers and voting machines, but progress had been slow in those industries. It wasn’t clear that automobiles would be any different.
“But it turned out this time was different. Shortly after disclosing the vulnerabilities we found, GM appointed a VP of product security to lead a new division of over 100 employees solely focused on improving the security of their vehicles,” he continued. “In 2012, DARPA announced their $60M+ High-Assurance Cyber Military Systems (HACMS) project, partially inspired by our work. The following year, industry security researchers began to replicate our work. But I think it finally hit me when DEF CON, the world’s largest hacker conference, introduced their Car Hacking Village in 2014.”
In addition to transforming an existing industry, the team’s work has also generated an entirely new one. “Our project spawned dozens of startup companies — and hundreds of jobs — focused on automobile security,” Savage noted.
Following the conclusion of the project, McCoy went on to join the faculty of New York University, while Shacham later left UCSD to join the faculty of the University of Texas at Austin. Anderson launched his own firm, Daniel Anderson Software Consulting, focused on creating independent iOS apps. Kantor later retired from UCSD after more than 30 years of service. He passed away last year.
“This work was really visionary at the time, and it proved to be a game-changer for industry, government, and academia,” Kohno concluded. “I like to think that was due to the high quality of the work, and how thoughtful we were in its execution.”
Allen School professor Franziska Roesner has earned an Undergraduate Research Mentor Award from the University of Washington. This honor recognizes her commitment to guiding undergraduate researchers to achieve success as research scholars. Students presenting their work at the annual Undergraduate Research Symposium were invited to nominate their mentors for this award, and a committee selected the honorees. This year, five out of 188 nominated mentors were chosen.
Roesner, co-director of the Security and Privacy Research Lab, mentors eight undergraduate researchers on her team. Savanna Yee, a fifth-year undergraduate in the lab, said Roesner’s affable personality made working in the lab less intimidating.
“Franzi is wonderful to work with. She’s very approachable, and really cares about prioritizing the goals of the undergrad students and makes sure to check in with us frequently,” Yee said. “When I first started working with Franzi I didn’t expect to have so much direct contact with a faculty member, but I am so glad that she makes time to check in with us and really get to know us as individuals. Franzi is honest, and open about her imperfections and struggles, and I really appreciate this because sometimes, when working with an expert leader in a field, we hold them up on a pedestal. But Franzi is so real about being a regular person, and this makes me very comfortable.”
Roesner attributes her passion for undergraduate research mentorship to her own early exposure to it at the University of Texas at Austin, from her professor at the time, Doug Burger.
“The only reason that my own career even followed this path is because I had an amazing undergraduate research mentor, so I am trying to pay it forward,” she said.
Kimberly Ruth, who is also a fifth-year senior in the Security and Privacy Research Lab, said Roesner’s support is inspiring.
“Franzi is an extraordinarily supportive mentor. She empowers me to be a meaningful contributor in project planning and implementation, giving me ample room to grow and contribute. Her communication is always clear, prompt, and friendly,” Ruth said. “Even amidst a busy faculty schedule, she always takes time to comment thoughtfully on works in progress: anything from a brainstormed list of ideas to a section of an academic paper in preparation to a research scholarship application essay. With her guidance and feedback, I’ve taken on increasing levels of autonomy and responsibility in my work, becoming increasingly self-sufficient and skilled as a young researcher. She’s given helpful advice at career decision points I’ve faced, sharing anecdotes that advise and reassure. I feel incredibly lucky to have Franzi as my mentor.”
Roesner, whose research spans a number of projects related to privacy and security in emerging technologies, said that developing research proficiency as an undergraduate is invaluable.
“I think the skills you learn in doing research are valuable beyond that specific field, or even a research-focused career path,” Roesner said. “You learn how to identify important problems, how to make concrete progress in the face of vast uncertainty about where to even begin or how to evaluate success, how to pick up new skills and knowledge as needed to solve your problem, how to collaborate and ask questions, how to grow from failure, and so on.”
Provost Mark Richards and Dean and Vice Provost for Undergraduate Academic Affairs Ed Taylor recognized the awardees in a recorded video message today before this year’s virtual symposium.
Congratulations, Franzi — and thank you for being an extraordinary mentor to our students!
Congratulations to incoming Security Lab postdoc Pardis Emami-Naeini for successfully defending her dissertation at CMU this week! Pardis’s PhD work at CMU has focused on tools and methods to better inform people’s privacy and security decision-making in the Internet of Things (IoT), advised by Lorrie Cranor and Yuvraj Agarwal. Pardis will be joining the UW Security and Privacy Research Lab in September. We look forward to welcoming you to Seattle (hopefully soon in person), Pardis!
Security Lab undergraduate researcher Savanna Yee has been recognized as one of the Husky 100, a program that recognizes students from across the University of Washington’s three campuses who are making the most of their Husky Experience.
Savanna Yee is a computer science and informatics major with a focus on human-centered interaction and is in the interdisciplinary honors program. She is in her fifth year as an undergraduate and is starting the B.S./M.S. program. She has had four internships, with two more coming up, and has worked as a TA and as a researcher in the Security and Privacy Research Lab for more than a year.
While serving as a mentor/tutor on the Pipeline Project, Yee learned to be a more empathetic leader. After a tragic loss during her junior year, she used what she learned from working through her pain to help others. Reflecting on her own vulnerability, Yee reached out to the Allen School community to encourage everyone to be more open about their own struggles. She created a panel discussion where students, staff and faculty of the Allen School could talk about their failures and vulnerabilities and how they overcame the obstacles. She also joined Unite UW, an organization helping to build a bridge between domestic and international students. She volunteers as a peer advisor in the Allen School, serves on the student advisory council and was an officer last year for the UW Association for Computing Machinery for Women.
“Mentor, maker, teacher, performer, advisor, advocate, researcher, event organizer. I’ve constantly lost and found myself here, uncertainty is something I’ve learned not to fear,” Yee said poetically. “Here I’ve gained new perspectives, been inspired by brilliance, opened up about depression, healing, resilience. U-Dub has fueled my interdisciplinary mind, always enticing me with more connections to find. By combining technology, ethics, wellbeing, and art, this is how I’ll empower people–or at least how I’ll start.”
The UW Security and Privacy Research Lab is excited to welcome two incoming PhD students, who will join us in the fall: Kaiming Cheng and Kentrell Owens. Kaiming will join us from the University of Virginia, where he has been working with Yuan Tian. Kentrell will join us from Carnegie Mellon University, where he has been working with Lorrie Cranor. Welcome, Kaiming and Kentrell!! We are so excited to have you join us, and we sincerely hope that you’ll be able to be in Seattle with us in person in the fall.
Kimberly Ruth, a senior graduating from the University of Washington this spring with bachelor’s degrees in computer engineering and mathematics, has been awarded the College of Engineering’s Dean’s Medal for Academic Excellence. Each year, the college recognizes two graduating students for academic excellence; Ruth’s combination of exemplary grades, rigorous coursework, hands-on research experience, and leadership on campus and off illustrates why she was chosen for the honor.
“We have a very strong program and many of our students are remarkable, but Kimberly stands out even from this select group,” said Allen School director and professor Magdalena Balazinska. “Her drive, leadership, undergraduate research and academic excellence are admirable, and she has only reached the beginning of her potential.”
As a freshman in the Allen School, Ruth set her sights on research right away. During her first quarter on campus, she reached out to professors Tadayoshi Kohno and Franziska Roesner, co-directors of the Security and Privacy Research Lab. Although she had not been on campus very long, she was invited by Kohno and Roesner to interview for a position as an undergraduate researcher.
“Though we met with several other promising undergraduates that day, we knew before our meeting with Kimberly even finished that she stood out far above the rest,” recalled Kohno. “She has now been working with us since January of 2016, and her work in the past four and a half years has only strengthened that initial impression.”
But that wasn’t quite enough for Ruth, who has made the most of her undergraduate research experience. In June of 2017, she also began leading her own project in augmented reality (AR) security, focusing on security for multiuser AR applications like the popular game Pokémon Go. The result was ShareAR, a toolkit that helps app developers build in collaborative and interactive features without sacrificing user privacy and security. Ruth and the team published their paper, “Secure Multi-User Content Sharing for Augmented Reality Applications,” last year at the 28th USENIX Security Symposium, where she presented the results.
“Kimberly’s work on this project was incredible. She independently raised, explored, prioritized, and answered a range of sophisticated research questions,” said Roesner. “She worked through design questions and implementation subtleties that were not only technically but also intellectually challenging—requiring thoughtful framing of the problem space and inventing new approaches.”
Outside of the lab, Ruth is also an adept teacher, helping her fellow students to succeed as a peer tutor for the Allen School’s Foundations in Computing course last year and inspiring the next generation through Go Figure, an initiative she founded to ignite middle school students’ interest in math.
“Kimberly is wholly deserving of all of the honors she has received, and I feel so privileged to have had the opportunity to work with her in this early stage of her career,” said Roesner. “I look forward to seeing all of the great things she will do in the future, whether in computer security research or otherwise.”
Some apps highlight when a person is online — and then share that information with their followers. When a user logs in to a website or app that uses online status indicators, a little green (or orange or blue) dot pops up to alert their followers that they’re currently online.
Researchers at the University of Washington wanted to know if people recognize that they are sharing this information and whether these indicators change how people behave online.
After surveying smartphone users, the team found that many people misunderstand online status indicators but still carefully shape their behavior to control how they are displayed to others. More than half of the participants reported that they had suspected that someone had noticed their status. Meanwhile, over half reported logging on to an app just to check someone else’s status. And 43% of participants discussed changing their settings or behavior because they were trying to avoid one specific person.
These results will be published in the Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems.
“Online status indicators are an unusual mechanism for broadcasting information about yourself to other people,” said senior author Alexis Hiniker, an assistant professor in the UW Information School. “When people share information by posting or liking something, the user is in control of that broadcast. But online status indicators are sharing information without taking explicit direction from the user. We believe our results are especially intriguing in light of the coronavirus pandemic: With people’s social lives completely online, what is the role of online status indicators?”
People need to be aware of everything they are sharing about themselves online, the researchers said.
“Practicing good online security and privacy hygiene isn’t just a matter of protecting yourself from skilled technical adversaries,” said lead author Camille Cobb, a postdoctoral researcher at Carnegie Mellon University who completed this research as a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. “It also includes thinking about how your online presence allows you to craft the identities that you want and manage your interpersonal relationships. There are tools to protect you from malware, but you can’t really download something to protect you from your in-laws.”
The team recruited 200 participants ages 19 to 64 through Amazon Mechanical Turk to fill out an online survey. Over 90% of the participants were from the U.S., and almost half of them had completed a bachelor’s degree.
The researchers asked participants to identify apps that they use from a list of 44 that have online status indicators. The team then asked participants if those apps broadcast their online status to their network. Almost 90% of participants correctly identified that at least one of the apps they used had online status indicators. But for at least one app they used, 62.5% answered “not sure” and 35.5% answered “no.” For example, of the 60 people who said they use Google Docs regularly, 40% said it didn’t have online status indicators and 28% were not sure.
Then the researchers asked the participants to time themselves while they located the settings to turn off “appearing online” in each app they used regularly. For the apps that have settings, participants gave up before they found the settings 28% of the time. For apps that don’t have these settings, such as WhatsApp, participants mistakenly thought they had turned the settings off 23% of the time.
“When you put some of these pieces together, you’re seeing that more than a third of the time, people think they’re not broadcasting information that they actually are,” Cobb said. “And then even when they’re told: ‘Please go try and turn this off,’ they’re still not able to find it more than a quarter of the time. Just broadly we’re seeing that people don’t have a lot of control over whether they share this information with their network.”
[Image: One way the team says designers could help people have more control over whether to broadcast their online status. Cobb et al., Proceedings of the 2020 ACM CHI Conference on Human Factors in Computing Systems]
Finally the team asked participants a series of questions about their own experiences online. These questions touched on whether participants noticed when others were online, if they thought others noticed when they were online and whether they had changed their own behavior because they did or didn’t want to appear online.
“We see this repeated pattern of people adjusting themselves to meet the demands of technology — as opposed to technology adapting to us and meeting our needs,” said co-author Lucy Simko, a UW doctoral student in the Allen School. “That means people are choosing to go online not because they want to do something there but because it’s important that their status indicator is projecting the right thing at the right time.”
Now that most states have put stay-at-home orders in place to try to combat the coronavirus pandemic, many people are working from home and socializing only online. This could change how people use online status indicators, the team says. For example, employees can use their online status to indicate that they are working and available for meetings. Or people can use a family member’s “available” status as an opportunity to check up on them and make sure they are OK.
“Right now, when a lot of people are working remotely, I think there’s an opportunity to think about how future evolutions of this technology can help create a sense of community,” Cobb said. “For example, in the real world, you can have your door cracked open and that means ‘interrupt me if you have to,’ you can have it wide open to say ‘come on in’ or you can have your door closed and you theoretically won’t get disturbed. That kind of nuance is not really available in online status indicators. But we need to have a sense of balance — to create community in a way that doesn’t compromise people’s privacy, share people’s statuses when they don’t want to or allow their statuses to be abused.”
Tadayoshi Kohno, a professor in the Allen School, is also a co-author on this paper. This research was funded by the UW Tech Policy Lab.
“If you build it, they will come.” That statement might hold true for a baseball field in rural Iowa — in the days before social distancing, that is — but what about when it comes to building mobile technologies to fight a global pandemic?
In the balance between individual civil liberties and the common good, there is an obvious tension between the urge to deploy the latest, greatest tools for tracking the spread of COVID-19 and the preservation of personal privacy. But according to a team of researchers and technologists affiliated with the Paul G. Allen School of Computer Science & Engineering, UW Medicine and Microsoft, there is a way to build technology that respects the individual and their civil liberties while supporting public health objectives and saving people’s lives.
In a white paper released yesterday, the team proposes a comprehensive set of principles to guide the development of mobile tools for contact tracing and population-level disease tracking while mitigating security and privacy risks. The researchers refer to these principles as PACT, short for “Privacy Sensitive Protocols and Mechanisms for Mobile Contact Tracing.”
“Contact tracing is one of the most effective tools that public health officials have to halt a pandemic and prevent future breakouts,” explained professor Sham Kakade, who holds a joint appointment in the Allen School and the UW Department of Statistics. “The protocols in PACT are specified in a transparent manner so the tradeoffs can be scrutinized by academia, industry, and civil liberties organizations. PACT permits a more frank evaluation of the underlying privacy, security, and re-identification issues, rather than sweeping these issues under the rug.”
If people were not familiar with the concept of contact tracing before, they surely are now with the outbreak of COVID-19. Public health officials have been relying heavily on the process to identify individuals who may have been exposed through proximity to an infected person to try to halt further spread of the disease. Several governments and organizations have deployed technology to assist with their response; depending on the situation, participation may be voluntary or involuntary. Whether optional or not, the increased use of technology to monitor citizens’ movements and identify other people with whom they meet has rightly sparked concerns around mass surveillance and a loss of personal privacy.
The cornerstone of the PACT framework put forward by the UW researchers is a third-party free approach, which Kakade and his colleagues argue is preferable to a “trusted third party” (TTP) model such as that used for apps administered by government agencies. Under PACT, strict user privacy and anonymity standards stem from a decentralized approach to data storage and collection. The typical TTP model, on the other hand, involves a centralized registration process wherein users subscribe to a service. While this can be a straightforward approach and is one that will be very familiar to users, it also centrally aggregates personally sensitive information that could potentially be accessed by malicious actors. This aggregation also grants the party in question — in this case, a government agency — the ability to identify individual users and to engage in mass surveillance.
The team’s white paper lays out in detail how mobile technologies combined with a third-party free approach can be used to improve the speed, accuracy, and outcomes of contact tracing while mitigating privacy concerns and preserving civil liberties. Among these is the outline of an app for conducting “privacy-sensitive” mobile contact tracing that relies on Bluetooth-based proximity detection to identify instances of co-location — that is, instances of two phones in proximity, via their pseudonyms — to determine who may be at risk. The team prefers co-location to absolute location information because it is more accurate than current GPS localization technologies, such as those in popular mapping and navigation apps, while affording more robust privacy protections to the user. Depending on the nature of the specific app, such a system could be useful in allowing people who test positive for the disease to securely broadcast information under a pseudonym to other app users who were in close proximity to them, without having to reveal their identity or that of the recipients.
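As a rough illustration of the third-party free idea — not the actual PACT specification, which defines its own key schedule, rotation intervals, and upload protocol — a device could derive short-lived pseudonyms from a private seed; a user who tests positive publishes only seeds, and every other device re-derives and matches pseudonyms locally, so no central party ever links identities to encounters. All names and parameters below are invented for the sketch.

```python
import hashlib
import secrets

def pseudonym(seed: bytes, epoch: int) -> bytes:
    """Derive the rotating Bluetooth pseudonym broadcast during one time epoch."""
    return hashlib.sha256(seed + epoch.to_bytes(4, "big")).digest()[:16]

# Each phone keeps a private random seed and broadcasts a fresh pseudonym
# each epoch; nearby phones log the pseudonyms they overhear.
alice_seed = secrets.token_bytes(32)
heard_by_bob = {pseudonym(alice_seed, e) for e in range(100, 103)}  # co-location log

# If Alice tests positive, she publishes her seed (not her identity) along
# with the epochs it covered; Bob checks for exposure entirely on-device.
published = [(alice_seed, range(90, 120))]
exposed = any(pseudonym(s, e) in heard_by_bob
              for s, epochs in published for e in epochs)
assert exposed  # Bob overheard pseudonyms for epochs 100-102
```

Because matching happens on the receiving device, the server that hosts published seeds learns who reported positive but nothing about who was exposed — the decentralized property the researchers contrast with a trusted-third-party design.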
Another example of how PACT can aid in the pandemic response is mobile-assisted contact tracing interviews. In this scenario, a person who tests positive completes a form on their smartphone listing their contacts in advance of the interview; the data remains on the person’s device until they choose to share it with public health officials. The team also describes a system for enabling narrowcast messages, which are public service messages pushed out from a government agency to a subset of the citizenry. Such communications might be used to inform people living in a specific area of local facility closures due to an outbreak, or to notify them in the event that they were at a location during the same time frame as a person who subsequently tested positive for the disease.
In all cases, the researchers advocate for retaining data locally on the person’s device until they initiate a transfer.
“Only with appropriate disclosures and voluntary action on the part of the user should their data be uploaded to external servers or shared with others — and even then, only in an anonymized fashion,” explained Allen School professor Shyam Gollakota. “We consider it a best practice to have complete transparency around how and where such data is used, as well as full disclosure of the risks of re-identification from previously anonymized information once it is shared.”
Gollakota and his colleagues emphasize that technology-enabled contact tracing can only augment — not entirely replace — conventional contact tracing. In fact, two out of the three applications they describe are designed to support the latter and were developed with input from public health organizations and from co-author Dr. Jacob Sunshine of UW Medicine. There is also the simple fact that, despite their seeming ubiquity, not everyone has a smartphone; of those who do, not everyone would opt to install and use a contact-tracing app.
As Allen School professor and cryptography expert Stefano Tessaro notes, all contact tracing — whether conventional or augmented with technology — involves tradeoffs between privacy and the public good.
“Contact tracing already requires a person to give up some measure of personal privacy, as well as the privacy of those they came into contact with,” Tessaro pointed out. “However, we can make acceptable tradeoffs to enable us to use the best tools available to speed up and improve that process, while at the same time ensuring meaningful privacy guarantees, as long as the people creating and implementing those tools adhere to the PACT.”