Franzi On KUOW’s “Primed” About Smart Homes

Security and Privacy Lab co-director Professor Franzi Roesner was interviewed on KUOW’s “Primed” podcast about how smart home technologies can exacerbate existing power dynamics or tensions among home occupants or visitors. Listen to the interview here, and read more about the Security Lab’s work on this topic in several of the lab’s papers.

Uncle Phil, is that really you? Allen School researchers decode vulnerabilities in online genetic genealogy services

(Cross-posted from Allen School News.)

Hand holding saliva collection tube
Marco Verch/Flickr

Genetic genealogy websites enable people to upload their results from consumer DNA testing services like Ancestry.com and 23andMe to explore their genetic makeup and familial relationships, and even discover relatives they didn’t know they had. But how can you be sure that the person who emails you claiming to be your Uncle Phil really is a long-lost relation?

Based on what a team of Allen School researchers discovered when interacting with the largest third-party genetic genealogy service, you may want to approach plans for a reunion with caution. In their paper “Genotype Extraction and False Relative Attacks: Security Risks to Third-Party Genetic Genealogy Services Beyond Identity Inference,” they analyze how security vulnerabilities in the GEDmatch website could allow someone to construct an imaginary relative or obtain sensitive information about people who have uploaded their personal genetic data.

Through a series of highly controlled experiments using information from the GEDmatch online database, Allen School alumnus and current postdoctoral researcher Peter Ney (Ph.D., ‘19) and professors Tadayoshi Kohno and Luis Ceze determined that it would be relatively straightforward for an adversary to exploit vulnerabilities in the site’s application programming interface (API) to compromise users’ privacy and expose them to potential fraud. The team demonstrated multiple ways in which they could extract highly personal, potentially sensitive genetic information about individuals on the site — and use existing familial relationships to create false new ones by uploading fake profiles that indicate a genetic match where none exists.

Part of GEDmatch’s attraction is its user-friendly graphical interface, which relies on bars and color-coding to visualize specific genetic markers and similarities between two profiles. For example, the “chromosome paintings” illustrate the differences between two profiles on each chromosome, accompanied by “segment coordinates” that indicate the precise genetic markers that the profiles share. These one-to-one comparisons, however, can be used to reveal more information than intended. It was this aspect of the service that the researchers were able to exploit in their attacks. To their surprise, they were not only able to determine the presence or absence of various genetic markers at certain segments of a hypothetical user’s profile, but also to reconstruct 92% of the entire profile with 98% accuracy.

As a first step, Ney and his colleagues created a research account on GEDmatch, to which they uploaded artificial genetic profiles generated from data contained in anonymous profiles from multiple, publicly available datasets designated for research use. By assigning each of their profiles a privacy setting of “research,” the team ensured that their artificial profiles would not appear in public matching results. Once the profiles were uploaded, GEDmatch automatically assigned each one a unique ID, which enabled the team to perform comparisons between a specific profile and others in the database — in this case, a set of “extraction profiles” created for this purpose. The team then performed a series of experiments. For the total profile reconstruction, they uploaded and ran comparisons between 20 extraction profiles and five targets. Based on the GEDmatch visualizations alone, they were able to recover just over 60% of the target profiles’ data. Based on their knowledge of genetics, specifically the frequency with which possible DNA bases are found within the population at a specific position on the genome, they were able to determine another 30%. They then relied on a genetic technique known as imputation to fill in the rest. 
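To make that reconstruction strategy concrete, here is a minimal, hypothetical sketch — not the researchers’ code, and using invented data structures — of how genotype calls recovered from comparison visualizations might be combined with population allele frequencies before handing the remaining gaps to imputation:

```python
# Illustrative sketch only: hypothetical data structures, not the paper's tooling.
# It mirrors the strategy described above: keep genotype calls recovered from
# one-to-one comparison visualizations, fall back to the most common allele in
# the population for many of the rest, and leave remaining positions to imputation.

def reconstruct_profile(recovered_calls, common_alleles, all_positions):
    """Combine directly recovered genotypes with population-based guesses.

    recovered_calls -- dict: SNP position -> genotype recovered from comparisons
    common_alleles  -- dict: SNP position -> (most common allele, its frequency)
    all_positions   -- iterable of every SNP position in the profile
    """
    profile = {}
    for pos in all_positions:
        if pos in recovered_calls:
            # Roughly 60% of positions in the study were recovered directly.
            profile[pos] = recovered_calls[pos]
        elif pos in common_alleles:
            allele, freq = common_alleles[pos]
            if freq > 0.95:  # threshold is illustrative, not from the paper
                # Another ~30% were inferred from population allele frequencies.
                profile[pos] = allele + allele
    # Positions still missing would be filled by statistical imputation,
    # which is outside the scope of this sketch.
    return profile
```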

Once they had reconstructed nearly all of a target’s profile, the researchers used that information to create a false child for one of their targets. When they ran the comparison between the target profile and the false child profile through the system, GEDmatch confirmed that the two were a match for a parent-child relationship.
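The false-relative idea follows from basic genetics: at each position, a child inherits one allele from each parent. A hypothetical sketch (invented representation, not the paper’s code) of how a fake “child” could be derived from a reconstructed profile so that a parent-child comparison matches:

```python
import random

# Illustrative sketch only: at every position the fake child takes one allele
# from the reconstructed parent profile and one sampled from population
# frequencies, producing the half-identical match expected of a real child.

def fake_child(parent_profile, population_allele_freqs):
    """parent_profile: dict pos -> two-character genotype string, e.g. 'AG'.
    population_allele_freqs: dict pos -> list of (allele, frequency) pairs."""
    child = {}
    for pos, genotype in parent_profile.items():
        from_parent = random.choice(genotype)            # one allele from the parent
        alleles, freqs = zip(*population_allele_freqs[pos])
        from_population = random.choices(alleles, weights=freqs)[0]
        child[pos] = from_parent + from_population
    return child
```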

While it is true that an adversary would have to have the right combination of programming skills and knowledge of genetics and genealogy to pull it off, the process isn’t as difficult as it sounds — or, to a security expert, as it should be. To acquire a person’s entire profile, Ney and his colleagues performed the comparisons between extraction and target profiles manually. They estimate the process took 10 minutes to complete — a daunting prospect, perhaps, if an adversary wanted to compare a much greater number of targets. But if one were to write a script that automatically performs the comparisons? “That would take 10 seconds,” said Ney, who is the lead author of the paper.

Consumer-facing genetic testing and genetic genealogy are still relatively nascent industries, but they are gaining in popularity. And as the size of the database grows, so does the interest of law enforcement looking to crack criminal cases for which the trail has gone cold. In one high-profile example from last year, investigators arrested a suspect alleged to be the Golden State Killer, whose identity remained elusive for more than four decades before genetic genealogy yielded a breakthrough. Given the prospect of using genetic information for this and other purposes, the researchers’ findings raise important questions about how to ensure the security and integrity of genetic genealogy results, now and into the future.

“We’re only beginning to scratch the surface,” said Kohno, who co-directs the Allen School’s Security and Privacy Research Lab and previously helped expose potential security vulnerabilities in internet-connected motor vehicles, wireless medical implants, consumer robotics, mobile advertising, and more. “The responsible thing for us is to disclose our findings so that we can engage a community of scientists and policymakers in a discussion about how to mitigate this issue.”

Echoing Kohno’s concern, Ceze emphasizes that the issue is made all the more urgent by the sensitive nature of the data that people upload to a site like GEDmatch — with broad legal, medical, and psychological ramifications — in the midst of what he refers to as “the age of oversharing information.”

“Genetic information correlates to medical conditions and potentially other deeply personal traits,” noted Ceze, who co-directs the Molecular Information Systems Laboratory at the University of Washington and specializes in computer architecture research as a member of the Allen School’s Sampa and SAMPL groups. “As more genetic information goes digital, the risks increase.”

Unfortunately for those who are not prone to oversharing, the risks extend beyond the direct users of genetic genealogy services. According to Ney, GEDmatch contains the personal genetic information of a sufficient number and variety of people across the U.S. that, should someone gain illicit possession of the entire database, they could potentially link genetic information with identity for a large portion of the country. While Ney describes the decision to share one’s data on GEDmatch as a personal one, some decisions appear to be more personal — and wider reaching — than others. And once a person’s genetic data is compromised, he notes, it is compromised forever. 

So whether or not you’ve uploaded your genetic information to GEDmatch, you might want to ask Uncle Phil for an additional form of identification before rushing to make up the guest bed. 

“People think of genetic data as being personal — and it is. It’s literally part of their physical identity,” Ney said. “You can change your credit card number, but you can’t change your DNA.”

The team will present its findings at the Network and Distributed System Security Symposium (NDSS 2020) in San Diego, California in February.

To learn more, read the UW News release here and an FAQ on security and privacy issues associated with genetic genealogy services here. Also check out related coverage by MIT Technology Review, OneZero, ZDNet, GeekWire, McClatchy, and Newsweek.

Summertime Celebration

This has been a very productive and busy summer for the UW Allen School Security and Privacy Research Lab! To celebrate the end of summer, the lab went on an outing to “Molly Moon’s Homemade Ice Cream”, a short walk from our building. It was a beautiful day, and the ice cream was great! 🙂

Summer Project Presentation: Henry Bowman

Visiting Cal Poly undergraduate Henry Bowman gave the final presentation for his summer project at today’s Security Lab meeting, before returning to Cal Poly to finish his bachelor’s degree.

Henry’s work focused on problems related to augmented reality, computer security, and privacy. As part of his summer project, Henry contributed to the Security Lab’s ShareAR project. ShareAR, or the Secure and Private AR Sharing Toolkit, is a project developed by Security Lab member Kimberly Ruth with faculty members Franzi and Yoshi that enables the secure and private sharing of holographic HoloLens objects with other users. Allen School undergraduate student AJ Kruse also contributed to the project this summer. To learn more about the project, see Kimberly’s 2019 USENIX Security paper and talk.

Great job Henry, and great talk!

New tools to minimize risks in shared, augmented-reality environments

(Cross-posted from UW News, by Sarah McQuate)

A person holding up an iPad that shows a digital world over the real world.

For now, augmented reality remains mostly a solo activity, but soon people might be using the technology in groups for collaborating on work or creative projects.

A few summers ago, throngs of people began using the Pokémon Go app, the first mass-market augmented reality game, to collect virtual creatures hiding in the physical world.

For now, AR remains mostly a solo activity, but soon people might be using the technology for a variety of group activities, such as playing multi-user games or collaborating on work or creative projects. But how can developers guard against bad actors who try to hijack these experiences, and prevent privacy breaches in environments that span digital and physical space?

University of Washington security researchers have developed ShareAR, a toolkit that lets app developers build in collaborative and interactive features without sacrificing their users’ privacy and security. The researchers presented their findings Aug. 14 at the USENIX Security Symposium in Santa Clara, California.

“A key role for computer security and privacy research is to anticipate and address future risks in emerging technologies,” said co-author Franziska Roesner, an assistant professor in the Paul G. Allen School of Computer Science & Engineering. “It is becoming clear that multi-user AR has a lot of potential, but there has not been a systematic approach to addressing the possible security and privacy issues that will arise.”

Sharing virtual objects in AR is in some ways like sharing files on a cloud-based platform like Google Drive — but there’s a big difference.

“AR content isn’t confined to a screen like a Google Doc is. It’s embedded into the physical world you see around you,” said first author Kimberly Ruth, a UW undergraduate student in the Allen School. “That means there are security and privacy considerations that are unique to AR.”

For example, people could potentially add inappropriate virtual images to physical public parks, scrawl offensive virtual messages on places of worship or even place a virtual “kick me” sign on an unsuspecting user’s back.

“We wanted to think about how the technology should respond when a person tries to harass or spy on others, or tries to steal or vandalize other users’ AR content,” Ruth said. “But we also don’t want to shut down the positive aspects of being able to share content using AR technologies, and we don’t want to force developers to choose between functionality and security.”

To address these concerns, the team created a prototype toolkit, ShareAR, for the Microsoft HoloLens. ShareAR helps applications create, share and keep track of objects that users share with each other.
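ShareAR itself is a HoloLens toolkit, and its real API is described in the team’s USENIX Security paper; the sketch below is only a hypothetical illustration (all names invented) of the kind of bookkeeping such a sharing layer performs: creating objects, granting other users access, and tracking who is allowed to see what.

```python
# Hypothetical sketch only: these class and method names are invented and do
# not reflect ShareAR's actual HoloLens API. It illustrates the bookkeeping a
# sharing layer handles so individual apps don't reimplement access control.

class SharingLayer:
    def __init__(self):
        self.objects = {}      # object id -> (owner, content)
        self.permissions = {}  # object id -> set of users allowed to see it

    def create_object(self, object_id, owner, content):
        self.objects[object_id] = (owner, content)
        self.permissions[object_id] = {owner}   # private to the owner by default

    def share(self, object_id, user):
        self.permissions[object_id].add(user)   # grant another user access

    def visible_to(self, user):
        # Which shared objects this user's view of the world should include.
        return [oid for oid, allowed in self.permissions.items() if user in allowed]
```

Centralizing this logic in a toolkit means each AR app doesn’t have to invent its own rules for who can see, move, or delete shared holograms.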

Another potential issue with multi-user AR is that developers need a way to signal the physical location of someone’s private virtual content to keep other users from accidentally standing in between that person and their work — like standing between someone and the TV. So the team developed “ghost objects” for ShareAR.

“A ghost object serves as a placeholder for another virtual object. It has the same physical location and rough 3D bulk as the object it stands in for, but it doesn’t show any of the sensitive information that the original object contains,” Ruth said. “The benefit of this approach over putting up a virtual wall is that, if I’m interacting with a virtual private messaging window, another person in the room can’t sneak up behind me and peer over my shoulder to see what I’m typing — they always see the same placeholder from any angle.”
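As a rough illustration of the ghost-object idea (hypothetical types, not ShareAR’s actual HoloLens API), a ghost copies only an object’s placement and bulk, never its content:

```python
from dataclasses import dataclass

# Hypothetical sketch only: invented types illustrating the ghost-object idea.
# A ghost keeps the original's pose and rough bounding volume but drops the
# sensitive payload, so other users see where something is but never what it says.

@dataclass
class ARObject:
    owner: str
    position: tuple   # world-space coordinates (x, y, z)
    bounds: tuple     # rough 3D extent (width, height, depth)
    content: str      # sensitive payload, e.g. text in a private window

@dataclass
class GhostObject:
    position: tuple
    bounds: tuple     # same placement and bulk, no content

def ghost_of(obj: ARObject) -> GhostObject:
    # Other users always see this placeholder, from any viewing angle.
    return GhostObject(position=obj.position, bounds=obj.bounds)
```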

The team tested ShareAR with three case study apps. Creating objects and changing permission settings within the apps were the most computationally expensive actions. But even when the researchers tried to stress the system with large numbers of users and shared objects, ShareAR took no longer than 5 milliseconds to complete a task. In most cases, it took less than 1 millisecond.

Three example case study apps, one showing virtual blocks over a living room, one showing virtual notes over the living room and one showing red paintballs over the living room.

The team tested ShareAR with three case study apps: Cubist Art (top panel), which lets users create and share virtual artwork with each other; Doc Edit (bottom left panel), which lets users create virtual notes or lists they can share or keep private; and Paintball (bottom right panel), which lets users play paintball with virtual paint. In the Doc Edit app, the semi-transparent gray box in the top left corner represents a “ghost object,” or a document that another user wishes to remain private. Ruth et al./USENIX Security Symposium

Developers can now download ShareAR to use for their own HoloLens apps.

“We’ll be very interested in hearing feedback from developers on what’s working well for them and what they’d like to see improved,” Ruth said. “We believe that engaging with technology builders while AR is still in development is the key to tackling these security and privacy challenges before they become widespread.”

Tadayoshi Kohno, a professor in the Allen School, is also a co-author on this paper. This research was funded by the National Science Foundation and the Washington Research Foundation.

Learn more about the UW Security & Privacy Lab and its role in the space of computer security and privacy for augmented reality.

###

For more information, contact Roesner at franzi@cs.washington.edu, Ruth at kcr32@cs.washington.edu and Kohno at yoshi@cs.washington.edu.

Grant numbers: CNS-1513584, CNS-1565252, CNS-1651230

Security Lab at USENIX Security 2019

The UW Security and Privacy Lab, and the lab’s friends and alumni, were out in force at USENIX Security 2019. On Wednesday, current UW Security and Privacy Lab members presented three papers in the same session:

Below are photos from each of the above talks, as well as from UW Systems Lab alumnus Charlie Reis’s talk (with alumnus Alex Moshchuk) on “Site Isolation: Process Separation for Web Sites within the Browser” and Ivan Evtimov’s talk on a new smarthome security lab (URL forthcoming).

Distinguished Paper Award @ USENIX Security 2019

Congratulations to UW Security and Privacy Lab member Christine Chen, advised by Prof. Franzi Roesner, and collaborator (and UW alumna) Nicki Dell for winning a Distinguished Paper Award at USENIX Security 2019! USENIX Security is one of the top peer-reviewed conferences in computer security, and this is an incredible honor. The authors are also extremely grateful to the people who participated in their study and for the opportunity to share those voices with the computer security and privacy community.

Read their paper on “Computer Security and Privacy in the Interactions Between Victim Service Providers and Human Trafficking Survivors” here.
