Security Lab researchers reveal how smart devices can be turned into surveillance tools with music

(Cross-posted from Allen School News.)

CovertBand demo

Researchers from the Allen School’s Networks & Mobile Systems Lab and Security and Privacy Research Lab teamed up on a new project, CovertBand, to demonstrate how smart devices can be converted into surveillance tools capable of secretly tracking the body movements and activities of users and their companions. CovertBand turns off-the-shelf devices into active sonar systems with the help of acoustic pulses concealed in music. The team’s findings reveal how increasingly popular smart home assistants and other connected devices could be used to compromise users’ privacy in their own homes — even from half a world away.

“Most of today’s smart devices including smart TVs, Google Home, Amazon Echo and smartphones come with built-in microphones and speaker systems — which lets us use them to play music, record video and audio tracks, have phone conversations or participate in videoconferencing,” Allen School Ph.D. student and co-lead author Rajalakshmi Nandakumar told UW News. “But that also means that these devices have the basic components in place to make them vulnerable to attack.”

As fellow author and Ph.D. student Alex Takakuwa points out, “Other surveillance approaches require specialized hardware. CovertBand shows for the first time that through-barrier surveillance is possible using no hardware beyond what smart devices already have.”

CovertBand relies on repetitive acoustic pulses in the range of 18 to 20 kHz. Those frequencies are high enough that most adults are unlikely to hear the signals, but young people and pets might — and a louder, more noticeable volume is required for more distant surveillance or to pick up activity through walls. To get around this, the team found that they could disguise the pulses under a layer of music, with repetitive, percussive beats proving most effective at hiding the additional sound.
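
The general idea can be sketched in a few lines of Python. The snippet below is a simplified illustration, not the team's actual implementation: it generates a repeating chirp that sweeps from 18 to 20 kHz and mixes it at low level beneath a synthetic percussive cover track. The sample rate, pulse timing, and mixing ratio are all hypothetical choices made for the example.

```python
# Illustrative sketch only; not the CovertBand implementation.
# Hides a repeating 18-20 kHz sonar chirp beneath an audible, percussive
# "music" track. All parameters are hypothetical choices for demonstration.
import numpy as np

FS = 48_000            # sample rate (Hz); must be high enough to carry ~20 kHz
PULSE_SEC = 0.01       # duration of each sonar chirp
PERIOD_SEC = 0.1       # repetition interval of the chirps
TRACK_SEC = 2.0        # total length of the example track

def sonar_pulse_train(fs=FS):
    """Repeating linear chirp sweeping 18-20 kHz, mostly inaudible to adults."""
    t = np.arange(int(PULSE_SEC * fs)) / fs
    f0, f1 = 18_000, 20_000
    phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * PULSE_SEC))
    chirp = np.sin(phase)
    train = np.zeros(int(TRACK_SEC * fs))
    step = int(PERIOD_SEC * fs)
    for start in range(0, len(train) - len(chirp), step):
        train[start:start + len(chirp)] = chirp
    return train

def percussive_cover(fs=FS):
    """Stand-in for a music track: short noise bursts as a drum-like beat."""
    rng = np.random.default_rng(0)
    track = np.zeros(int(TRACK_SEC * fs))
    burst_len = int(0.05 * fs)
    burst = rng.standard_normal(burst_len) * np.hanning(burst_len)
    for start in range(0, len(track) - burst_len, int(0.25 * fs)):
        track[start:start + burst_len] = burst
    return track

music = percussive_cover()
pulses = sonar_pulse_train()
# Mix the near-ultrasonic pulses well below the music so the beat masks them.
mix = 0.9 * music / np.max(np.abs(music)) + 0.1 * pulses
```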

Tadayoshi Kohno, Rajalakshmi Nandakumar, Shyam Gollakota

Left to right: Tadayoshi Kohno, Rajalakshmi Nandakumar, and Shyam Gollakota (Not pictured: Alex Takakuwa)

“To our knowledge, this is the first time anyone has demonstrated that it is possible to convert smart commodity devices into active sonar systems using music,” said Allen School professor and co-author Shyam Gollakota.

By connecting a smartphone to a portable speaker or flat-screen TV, the researchers found they could use the data collected through CovertBand to accurately identify repetitive movements such as walking, jumping, and exercising at distances of up to six meters within line of sight, and up to three meters through walls. Having proven the concept, the researchers believe that more data combined with machine learning tools would enable rapid classification of a greater variety of movements — and perhaps even identify the individual making them.
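
For intuition about why repetitive movements stand out in the reflected signal, consider the sketch below. It is illustrative only and not CovertBand's signal-processing pipeline: it assumes the received echoes have already been organized into a 2-D profile (one row per chirp, one column per range bin) and estimates a repetition rate per bin with a simple FFT.

```python
# Illustrative sketch only; not CovertBand's actual signal processing.
# Repetitive motion (walking, jumping) shows up as a periodic change in a
# range bin over successive chirps; look for a dominant low-frequency peak.
import numpy as np

def repetition_rate(echo_profile, chirp_rate_hz=10.0):
    """Estimate the repetition frequency (Hz) of motion in each range bin.

    echo_profile: real-valued array of shape (num_chirps, num_range_bins).
    chirp_rate_hz: how many sonar pulses are emitted per second.
    """
    # Remove the static background (stationary reflectors) per range bin.
    motion = echo_profile - echo_profile.mean(axis=0, keepdims=True)
    spectrum = np.abs(np.fft.rfft(motion, axis=0))
    freqs = np.fft.rfftfreq(motion.shape[0], d=1.0 / chirp_rate_hz)
    # Skip the DC bin and report the strongest periodic component per bin.
    dominant = freqs[1 + np.argmax(spectrum[1:], axis=0)]
    return dominant   # e.g. roughly 2 Hz in the bin where someone is pacing
```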

With CovertBand, Allen School researchers have identified a plausible threat, given the increasing ubiquity of these devices in our pockets and in our living rooms. But our embrace of emerging technologies needn’t end on a sour note. As professor and co-author Tadayoshi Kohno points out, when it comes to cybersecurity, knowledge is power.

“We’re providing education about what is possible and what capabilities the general public might not know about, so that people can be aware and can build defenses against this,” he said.

The researchers will present a paper detailing their findings at the UbiComp 2017 conference in Maui, Hawaii next month.

Read the full UW News release here. Learn more and listen to samples of the CovertBand attack music on the project web page here. Check out articles on CovertBand in Fast Company, Digital Trends, New Atlas, and The Register.

Security Lab researchers expose cybersecurity risks of DNA sequencing software

(Cross-posted from Allen School News.)

Lee Organick, Karl Koscher, and Peter Ney prepare the DNA exploit

Left to right: Lee Organick, Karl Koscher, and Peter Ney prepare the DNA exploit.

In an illustration of just how narrow the divide between the biological and digital worlds has become, a team of researchers from the Allen School released a study revealing potential security risks in software commonly used for DNA sequencing and analysis — and demonstrated for the first time that it is possible to infect software systems with malware delivered via DNA molecules. The team will present its paper, “Computer Security, Privacy, and DNA Sequencing: Compromising Computers with Synthesized DNA, Privacy Leaks, and More,” at the USENIX Security Symposium in Vancouver, British Columbia next week.

Many open-source systems used in DNA analysis began in the cloistered domain of the research lab. As the cost of DNA sequencing has plummeted, new medical and consumer-oriented services have taken advantage, leading to more widespread use — and with it, potential for abuse. While there is no evidence to indicate that DNA sequencing software is at imminent risk, the researchers say now would be a good time to address potential vulnerabilities.

“One of the big things we try to do in the computer security community is to avoid a situation where we say, ‘Oh shoot, adversaries are here and knocking on our door and we’re not prepared,’” said professor Tadayoshi Kohno, co-director of the Security and Privacy Research Lab, in a UW News release.

Tabloid headline: "Computer Virus Spreads to Humans!"

Researcher Tadayoshi Kohno wondered if what this tabloid headline suggested would work in reverse: Could DNA be used to deliver a virus to a computer?

Kohno and Karl Koscher (Ph.D., ’14), who works with Kohno in the Security and Privacy Research Lab, have been down this road before — literally as well as figuratively. In 2010, they and a group of fellow UW and University of California, San Diego security researchers demonstrated that it was possible to hack into modern automobile systems connected to the internet. They have also explored potential security vulnerabilities in implantable medical devices and household robots.

Kohno conceived of this latest experiment after he came across an online discussion about a tabloid headline in which a person was alleged to have been infected by a computer virus. While he wasn’t about to take that fantastical storyline at face value, Kohno was curious whether the concept might work in reverse.

Kohno, Koscher, and Allen School Ph.D. student Peter Ney — representing the cybersecurity side of the equation — teamed up with professor Luis Ceze and research scientist Lee Organick of the Molecular Information Systems Lab, where they are working on an unrelated project to create a DNA-based storage solution for digital data. The group decided not only would they analyze existing software for vulnerabilities; they would attempt to exploit them.

“We wondered whether under semi-realistic circumstances it would be possible to use biological molecules to infect a computer through normal DNA processing,” Ney said.

As it turns out, it is possible. The team introduced a known vulnerability into software they would then use to analyze the DNA sequence. They encoded a malicious exploit within strands of synthetic DNA, and then processed those strands using the compromised software. When they did, the researchers were able to execute the encoded malware to gain control of the computer on which the sample was being analyzed.
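
The encoding step can be illustrated with a toy example. The snippet below is not the team's actual scheme: it uses a common two-bits-per-base mapping, ignores the real-world synthesis and sequencing constraints (GC content, homopolymer runs, read length) the researchers had to work around, and the payload bytes are placeholders rather than a real exploit.

```python
# Illustrative sketch only; not the team's actual encoding.
# A simple convention maps two bits to one nucleotide.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def bytes_to_dna(payload: bytes) -> str:
    """Encode arbitrary bytes as a DNA string, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in payload)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def dna_to_bytes(strand: str) -> bytes:
    """Decode a DNA string back into bytes (inverse of bytes_to_dna)."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

payload = b"\x90\x90\xcc"       # hypothetical placeholder bytes, not a real exploit
strand = bytes_to_dna(payload)  # "GCAAGCAATATA"
assert dna_to_bytes(strand) == payload
```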

While there are a number of physical and technical challenges someone would have to overcome to replicate the experiment in the wild, it nevertheless should serve as a wake-up call for an industry that has not yet had to contend with significant cybersecurity threats. According to Koscher, there are steps companies and labs can immediately take to improve the security of their DNA sequencing software and practice good “security hygiene.”

Onscreen output of a DNA sequencing machine

This output from a DNA sequencing machine includes the team’s exploit.

“There is some really low-hanging fruit out there that people could address just by running standard software analysis tools that will point out security problems and recommend fixes,” he suggested. For the longer term, the group’s recommendations include employing adversarial thinking in setting up new processes, verifying the source of DNA samples prior to processing, and developing the means to detect malicious code in DNA.

The team emphasized that people who use DNA sequencing services should not worry about the security of their personal genetic and medical information — at least, not yet. “Even if someone wanted to do this maliciously, it might not work,” Organick told UW News.

Ceze admits he is concerned by what the team discovered during their analysis, but that concern is largely rooted in conjecture at this point.

“We don’t want to alarm people,” Ceze pointed out. “We do want to give people a heads up that as these molecular and electronic worlds get closer together, there are potential interactions that we haven’t really had to contemplate before.”

Visit the project website and read the UW News release to learn more.

Also see coverage in Wired, The Wall Street Journal, MIT Technology Review, The Atlantic, TechCrunch, Mashable, Gizmodo, ZDNet, GeekWire, Inverse, IEEE Spectrum, and TechRepublic.

Eric Zeng at SOUPS 2017

Today Security Lab PhD student Eric Zeng presented at the 13th Symposium on Usable Privacy and Security (SOUPS) in Santa Clara, CA. Eric presented his paper “End User Security & Privacy Concerns with Smart Homes,” for which he interviewed fifteen people living in smart homes (twelve smart home administrators and three other residents) to learn how they use their smart homes and to understand their security- and privacy-related attitudes, expectations, and actions. This work was done in collaboration with Security Lab postdoc Shrirang Mare and faculty member Franzi Roesner. Read the full research paper here. Congrats Eric on a great presentation!

Welcome Earlence!

The Security and Privacy Research Lab is excited to welcome postdoctoral researcher Earlence Fernandes. Earlence’s research focuses on computer security and privacy, with a special interest in cyber-physical systems. His recent work, a security analysis of the popular SmartThings platform, received the Distinguished Practical Paper Award at the 2016 IEEE Symposium on Security and Privacy. Earlence joins us after earning his PhD at the University of Michigan in 2017, where he was advised by Prof. Atul Prakash. Welcome Earlence!

Graduation 2017

Congratulations to Dr. Ada Lerner and Dr. Paul Vines on their PhD graduations today! Dr. Lerner will soon be joining Wellesley College as an assistant professor, and Dr. Vines will be joining BAE Systems as a Principal Research Engineer. Congratulations to all the UW Allen School graduates!

Security Lab researchers shed light on secret surveillance with SeaGlass

(Cross-posted from Allen School News.)

Peter Ney and Ian Smith

Researchers Peter Ney (left) and Ian Smith

A team of researchers in the Allen School’s Security and Privacy Research Lab has developed a new system, SeaGlass, which could bring more transparency and accountability to cell-phone surveillance. SeaGlass is capable of detecting anomalies in the cellular network that may indicate the presence of surveillance devices called IMSI-catchers (also known as cell-site simulators or Stingrays), which track individuals through their International Mobile Subscriber Identity by posing as legitimate cell towers.

WIRED magazine recently talked to the researchers about SeaGlass and their hope that the findings contribute to the public discourse. From the article:

“Law enforcement’s use of the surveillance devices known as stingrays, fake cell towers that can intercept communications and track phones, remains as murky as it is controversial, hidden in non-disclosure agreements and cloak-and-dagger secrecy. But a group of Seattle researchers has found a new method to track those trackers: by recruiting ridesharing vehicles as surveillance devices of their own.

“For two months last year, researchers at the University of Washington paid drivers of an unidentified ridesharing service to keep custom-made sensors in the trunks of their cars, converting those vehicles into mobile cellular data collectors.”

GIF of SeaGlass

Data collected from a single cell tower in Seattle over two months

The article explains how the information gathered by those mobile data collectors was analyzed to identify irregularities in the network compared to normal cell tower behavior, which could indicate that a surveillance device is active in the area. By using off-the-shelf parts and deploying the system on fleet vehicles that cover a lot of ground during their normal course of business, the researchers demonstrated SeaGlass to be an inexpensive and unobtrusive way to map the cellular landscape.
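
A very rough sketch of that kind of anomaly detection appears below. It is not the SeaGlass codebase: the record fields (cell ID, broadcast channel, signal strength) and the thresholds are assumptions made for illustration. It simply flags scans whose signal strength falls far outside a tower's own history, or that advertise a channel never before seen for that cell.

```python
# Illustrative sketch only; not the SeaGlass codebase. Assumes each scan
# record carries a cell ID, broadcast channel, and signal strength.
from collections import defaultdict
from statistics import mean, pstdev

def find_anomalies(scans, min_history=20, z_threshold=4.0):
    """Flag scans that deviate sharply from the observed tower's own baseline."""
    history = defaultdict(lambda: {"strengths": [], "channels": set()})
    anomalies = []
    for scan in scans:  # scan: dict with keys cell_id, channel, strength_dbm
        record = history[scan["cell_id"]]
        strengths, channels = record["strengths"], record["channels"]
        if len(strengths) >= min_history:
            mu, sigma = mean(strengths), pstdev(strengths) or 1.0
            if abs(scan["strength_dbm"] - mu) / sigma > z_threshold:
                anomalies.append(("unusual_strength", scan))
            if scan["channel"] not in channels:
                anomalies.append(("new_channel", scan))
        strengths.append(scan["strength_dbm"])
        channels.add(scan["channel"])
    return anomalies
```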

During the pilot, the system detected three anomalies in the greater Seattle area that piqued the team’s interest. Although the results do not offer conclusive proof of an IMSI-catcher, they do provide a good starting point for further exploration — and for an overdue conversation about surveillance practices.

“Up until now the use of IMSI-catchers around the world has been shrouded in mystery, and this lack of concrete information is a barrier to informed public discussion,” Ph.D. student Peter Ney said in a UW News release. “Having additional, independent and credible sources of information on cell-site simulators is critical to understanding how — and how responsibly — they are being used.”

The team, which also includes Allen School research scientist Ian Smith, Ph.D. student Gabriel Cadamuro, and professor Tadayoshi Kohno, hopes that its work will contribute to a more robust public debate over cell-phone surveillance. They describe in detail the SeaGlass technology and results of the pilot in a paper that will be published this month in the Proceedings on Privacy Enhancing Technologies.

“SeaGlass is a promising technology that — with wider deployment — can be used to help empower citizens and communities to monitor this type of surveillance,” said Smith. “This issue is bigger than one team of researchers. We’re eager to push this out into the community and find partners who can crowdsource more data collection and begin to connect the dots in meaningful ways.”

Read the WIRED story here, the UW News release here, and additional coverage by TechCrunch, Gizmodo, and Engadget. Learn more at the SeaGlass website here.

Eric’s Quals Talk

Security Lab PhD student Eric Zeng gave a great Quals Talk today at the Allen School, describing his work studying end user security and privacy concerns with smart homes. The Quals project and talk — along with all the relevant coursework — fulfill a major milestone along the path to a PhD. Eric will be presenting a similar talk at the Symposium on Usable Privacy and Security (SOUPS) in July. Congratulations Eric!

Kiron Lebeck at Oakland 2017

Today Security Lab PhD student Kiron Lebeck presented at the 38th IEEE Symposium on Security & Privacy (Oakland) in San Jose, CA. Kiron presented his paper “Securing Augmented Reality Output,” describing a design for an augmented reality platform that mitigates risks from buggy or malicious application output (e.g., virtual content that obscures a user’s view of important physical-world objects, like oncoming cars, or that startles the user). This work was done in collaboration with Security Lab undergraduate researcher Kimberly Ruth and faculty members Yoshi Kohno and Franzi Roesner. Congrats Kiron on a great presentation!

Read the full research paper here and learn more about the UW Security Lab’s efforts on security and privacy for emerging augmented reality platforms here. This work was also recently covered by Science.
