Securing the Fourth Estate: What the Panama Papers and Confidante reveal about journalists’ needs and practices

(Cross-posted from Allen School News.)


Reporters contributing to the Panama Papers investigation meet in Munich, Germany to receive training on ICIJ’s research tools. Photo credit: Kristof Clerix

When the Panama Papers story first broke in April 2016, its explosive revelations of a vast and hidden network of offshore shell companies and financial scandals-in-waiting tied to politicians, corporations, banking institutions, and organized crime represented a victory for good, old-fashioned investigative journalism — with a high tech twist. In addition to provoking international outrage, toppling governments, and instigating audits and investigations in more than 70 countries, the story caught the eye of researchers like Allen School professor Franziska Roesner, who — working with a team of researchers from the University of Washington’s Security and Privacy Research Lab and collaborators at Columbia University and Clemson University — has made a study of the security practices of journalists and developed new solutions tailored to the needs of the Fourth Estate.

While the users of secure systems are notoriously the weakest link, what Roesner and colleagues found in examining the successful Panama Papers investigation was that the users — in this case, the more than 300 reporters spread across six continents working under the auspices of the International Consortium of Investigative Journalists (ICIJ) — were, in fact, a source of strength.

“Success stories in computer security are rare,” noted Roesner. “But we discovered that the journalists involved in the Panama Papers project seem to have achieved their security goals.”

The researchers set out to determine how hundreds of journalists with varying degrees of technical acumen were able to collaborate securely on the year-long investigation, which involved 11.5 million leaked documents from Panama-based law firm Mossack Fonseca implicating individuals and entities at the highest reaches of power. They relied on a combination of survey data from 118 journalists who participated in the investigation and in-depth, semi-structured interviews with those who designed and implemented the security systems that facilitated global collaboration while protecting those doing the collaborating. The team presented its findings in the paper, “When the Weakest Link Is Strong: Secure Collaboration in the Case of the Panama Papers,” at the 26th USENIX Security Symposium in Vancouver, Canada last month.


Allen School professor Franziska Roesner has made a study of journalists’ security needs and practices

Roesner and her colleagues were surprised to discover the extent to which ICIJ was able to strictly and consistently enforce security requirements such as PGP and two-factor authentication — even among those for whom such tools and practices were new. One of the main reasons the operation was a success, the researchers found, came down to utility.

“We found that the tools developed for the project were highly useful and usable, which motivated journalists to use the secure communication platforms provided by the ICIJ,” explained Susan McGregor, a professor at Columbia Journalism School and a principal investigator, along with Kelly Caine of Clemson University’s School of Computing, on the study.

They also found that journalists were motivated by more than sheer usefulness: their sense of community, and responsibility to that community, spurred them to not only tolerate but to embrace the strict security requirements put in place.

“The project leaders frequently communicated the importance of security and mutual trust,” Roesner noted. “This cultivated a strong sense of shared responsibility for the security of not only themselves, but of their colleagues — they were all in this together, and that was a powerful factor in the success of the operation, from a security standpoint.”

It also helped that the ICIJ walked the talk: if a journalist did not have access to a cellphone that could serve as a second factor, the organization purchased and configured one for them. The organization also made PGP a default tool and ensured everyone had a PGP key, taking the guesswork out of evaluating and selecting appropriate tools on their own.
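The “second factor” in question is typically a short one-time code generated on the phone. As an illustrative aside (the ICIJ’s exact setup isn’t detailed here), the standard TOTP algorithm from RFC 6238 — the one behind most authenticator apps — fits in a few lines of Python:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Generate an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(at if at is not None else time.time()) // step
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    # Dynamic truncation per RFC 4226: the low nibble of the last byte
    # picks a 4-byte window in the digest.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 reference secret "12345678901234567890", time T = 59 seconds.
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, at=59))  # -> 287082
```

Because the code depends on a shared secret and the current 30-second window, a stolen password alone is useless to an attacker — which is exactly why ICIJ insisted on it.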

ICIJ’s approach helped it to avoid a number of known pitfalls when it comes to journalists’ security. Earlier work by Roesner and her collaborators that examined the security and privacy needs and constraints of journalists as well as those of the media organizations that employ them revealed the inadequacy of current tools, which often impede the gathering of information. The researchers found that this often led journalists to create ad-hoc workarounds that may compromise their own security and the security of their sources.

Armed with the lessons learned from those previous studies, Roesner teamed up with Allen School Ph.D. students Ada Lerner (now a faculty member at Wellesley College) and Eric Zeng, and undergraduate student Mitali Palekar to develop Confidante, a usable encrypted email client for journalists and others who require secure electronic communication that aims to improve on traditional PGP tools like those used in the Panama Papers investigation.

“We built Confidante to explore how we could combine strong security with ease of use and minimal configuration. One of our goals was for it to feel, as much as possible, like using regular email,” explained Lerner.


Confidante team members, clockwise from top left: Ada Lerner, Mitali Palekar, and Eric Zeng

“Building it allowed us to get really specific with journalists in our user study, since it was a prototype they could try out and react to — and that allowed us to ask them about the ways in which it did and didn’t meet their needs,” she continued. “It let us more concretely understand what kind of system might be able to provide journalists with strong protections, including reducing user errors that might inadvertently compromise their security.”

Confidante is built on top of Gmail for sending and receiving messages and Keybase for automatic public/private key management. In a study of a working prototype involving journalists and lawyers, the team found that Confidante enabled users to complete an encrypted email task more quickly, and with fewer errors, than an existing email encryption tool. Mobile compatibility was another feature that met with users’ approval.
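Confidante’s core idea — look up the recipient’s key automatically so the user never handles key material — can be sketched in a few lines. Everything below is hypothetical: the in-memory directory stands in for a Keybase-style identity lookup, and the XOR keystream is a deliberately toy placeholder (NOT real cryptography) so the flow stays runnable with only the standard library; real Confidante uses actual PGP encryption.

```python
import hashlib
import secrets

# Toy stand-in for a Keybase-style directory: identity -> key material.
KEY_DIRECTORY = {"reporter@example.org": secrets.token_bytes(32)}

def _keystream(key, nonce, length):
    """Derive a pseudorandom keystream (toy cipher, NOT real crypto)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(recipient, plaintext):
    """Encrypt for a recipient. The key lookup happens automatically --
    the user never sees or manages keys, which is Confidante's key idea."""
    key = KEY_DIRECTORY[recipient]
    nonce = secrets.token_bytes(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce, bytes(a ^ b for a, b in zip(plaintext, stream))

def open_sealed(recipient, nonce, ciphertext):
    key = KEY_DIRECTORY[recipient]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, stream))

nonce, ct = seal("reporter@example.org", b"meet at the usual place")
print(open_sealed("reporter@example.org", nonce, ct))  # round-trips to the plaintext
```

The point of the sketch is the shape of the workflow: because the directory lookup is automatic, “encrypt to this person” becomes a one-step operation, much like addressing a regular email.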

“Every journalist and lawyer involved in our user study regularly reads and responds to email on the go, so any encrypted email solution developed for this group must work on mobile devices,” noted Zeng. “As a standalone email app built with modern web technologies, Confidante meets this need, whereas integrated PGP tools like browser extensions do not.”

Some participants observed that using Confidante, with its automated key management, was not that different from sending regular email — suggesting that Roesner and her colleagues had hit the mark when it comes to balancing user preferences and strong security.

“Tools fail in part when the technical community has built the wrong thing, so it’s important for us as computer security researchers to understand user needs and constraints,” observed Roesner. “What the Panama Papers study and Confidante illustrate is that there are ways to help journalists to do their jobs securely as well as effectively — and this is important not just for these individuals and their sources, but for society at large.”

Read the USENIX Security paper to learn more about computer security and the Panama Papers. Visit the Confidante website to try out the prototype and view the publicly available source code from the Allen School research team.

Franziska Roesner recognized with TR35 Award

(Cross-posted from Allen School News.)

Allen School professor Franziska Roesner has been recognized with a 2017 TR35 Award, MIT Technology Review’s annual celebration of the world’s 35 top innovators under the age of 35. Roesner is honored in the “Inventors” category, which recognizes visionary individuals who are creating the breakthroughs and building the technologies that will shape the future.

Roesner co-directs the Allen School’s Security and Privacy Research Lab, where she analyzes the security and privacy risks of existing and emerging technologies and develops tools to safeguard end users. She is also a member of the University of Washington’s interdisciplinary Tech Policy Lab.

She is the first computer scientist to analyze the risks associated with augmented reality (AR) technologies in order to support the design of systems that mitigate vulnerabilities in these emerging platforms. These technologies are becoming increasingly popular, not only for entertainment but also for assistive purposes, such as heads-up windshield displays in cars. When Roesner began studying them in 2011, products such as Google Glass had not been announced yet and such technologies were still largely in the realm of science fiction. Roesner’s research covers issues associated with both inputs and outputs, from the potentially sensitive sensor data these platforms collect on users in the course of their interactions, to the impact of visual ad content on the safety of users and bystanders. Her impact in AR and virtual reality (VR) extends beyond the lab: her research has made her a go-to source for other researchers, government regulators, and industry leaders on how to counter the privacy, security, and safety risks in order to realize the full potential of these emerging technologies.

Web privacy and security is another area in which Roesner has produced pioneering research with a lasting impact on users. In 2011, when web tracking was a nascent concern, she produced the first comprehensive measurement of third-party tracking on the web. More recently, her team studied the evolution of tracking methods over a 20-year period, from 1996 to 2016, using a novel tool called Tracking Excavator. Roesner previously built an anti-tracking tool, ShareMeNot, whose code was incorporated into the Electronic Frontier Foundation’s Privacy Badger browser add-on. Privacy Badger and other add-ons that incorporated ShareMeNot’s ideas are used by millions of people to safeguard their privacy online.
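At its core, a third-party tracking measurement classifies each request a page makes as first- or third-party, then counts how many distinct sites each tracker reaches. A minimal sketch of that idea, using hypothetical URLs and a crude “last two host labels” heuristic in place of the Public Suffix List a real measurement would use:

```python
from collections import defaultdict
from urllib.parse import urlparse

def site(url):
    """Approximate the registrable domain as the last two host labels.
    (A real study would consult the Public Suffix List instead.)"""
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

def third_party_reach(requests):
    """For each third-party domain, count how many distinct first-party
    sites embed it -- the core metric in web-tracking measurements."""
    seen = defaultdict(set)
    for page_url, request_url in requests:
        page, req = site(page_url), site(request_url)
        if page != req:  # cross-site request: a potential tracker
            seen[req].add(page)
    return {dom: len(pages) for dom, pages in seen.items()}

log = [
    ("https://news.example.com/a", "https://cdn.tracker.net/px.gif"),
    ("https://shop.example.org/b", "https://cdn.tracker.net/px.gif"),
    ("https://news.example.com/a", "https://news.example.com/logo.png"),
]
print(third_party_reach(log))  # -> {'tracker.net': 2}
```

A domain quietly present on a large fraction of the sites a user visits is exactly what tools like ShareMeNot and Privacy Badger are designed to surface and block.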

Another user group that has benefitted from Roesner’s user-centric research is journalists and others who rely on secure communication with sources, clients, and colleagues. After hearing stories like how it took reporter Glenn Greenwald months to establish a secure email connection with source Edward Snowden, she collaborated with experts from the journalism community on a study of the computer security needs of journalists and lawyers. Based on those findings, Roesner spearheaded the development of Confidante, a usable encrypted email client that offers the security of traditional encryption technologies without the friction of traditional key management and verification.

“Ideally, we’d like to design and build security and privacy tools that actually work for end users. But to do that, we need to engage with those users, to understand what they need, and not build technology in isolation,” Roesner told UW News.

“As our technologies progress and become even more integral to our lives, the push to consider privacy and security issues will only increase,” she said.

Before joining the UW faculty in 2014, Roesner earned her Ph.D. and Master’s degree from the Allen School working with professor Tadayoshi Kohno, and bachelor’s degrees in computer science and liberal arts from the University of Texas at Austin.

Since 1999, MIT Technology Review has published its annual list of “Innovators Under 35” recognizing exceptional early-career scientists and technologists whose research has the potential to change the world. Past TR35 honorees include Allen School faculty members Shyam Gollakota and Kurtis Heimerl (2014), Jeffrey Heer and Shwetak Patel (2009), and Tadayoshi Kohno (2007), and alumni Kuang Cheng (2014), Noah Snavely (2011), Scott Saponas (2010), Jeffrey Bigham and Adrien Treuille (2009), and Karen Liu and Tapan Parikh (2007).

View Roesner’s TR35 profile here and the full list of 2017 TR35 recipients here.

Congratulations, Franzi!

Security Lab researchers reveal how smart devices can be turned into surveillance devices with music

(Cross-posted from Allen School News.)


Researchers from the Allen School’s Networks & Mobile Systems Lab and Security and Privacy Research Lab teamed up on a new project, CovertBand, to demonstrate how smart devices can be converted into surveillance tools capable of secretly tracking the body movements and activities of users and their companions. CovertBand turns off-the-shelf devices into active sonar systems with the help of acoustic pulses concealed in music. The team’s findings reveal how increasingly popular smart home assistants and other connected devices could be used to compromise users’ privacy in their own homes — even from half a world away.

“Most of today’s smart devices including smart TVs, Google Home, Amazon Echo and smartphones come with built-in microphones and speaker systems — which lets us use them to play music, record video and audio tracks, have phone conversations or participate in videoconferencing,” Allen School Ph.D. student and co-lead author Rajalakshmi Nandakumar told UW News. “But that also means that these devices have the basic components in place to make them vulnerable to attack.”

As fellow author and Ph.D. student Alex Takakuwa points out, “Other surveillance approaches require specialized hardware. CovertBand shows for the first time that through-barrier surveillance is possible using no hardware beyond what smart devices already have.”

CovertBand relies on repetitive acoustic pulses in the range of 18 to 20 kHz. Those frequencies are typically too high for most adults to hear, though young people and pets might pick up on the signals — and a louder, potentially audible volume is required for more distant surveillance or to pick up activity through walls. To get around this, the team found that they could disguise the pulses under a layer of music, with repetitive, percussive beats proving the most effective at hiding the additional sound.
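The basic signal design can be sketched in a few lines: generate a short near-ultrasonic sweep and mix it into the audio at low amplitude. This is illustrative only — the 10 ms pulse length, linear sweep, and 0.1 mixing gain here are assumptions for the sketch, not the team’s actual parameters.

```python
import math

SAMPLE_RATE = 44_100  # Hz, standard audio rate

def chirp(f0, f1, duration):
    """Linear frequency sweep from f0 to f1 -- the repeated sonar pulse."""
    n = int(SAMPLE_RATE * duration)
    samples = []
    for i in range(n):
        t = i / SAMPLE_RATE
        # Phase of a linear chirp: 2*pi*(f0*t + (f1 - f0)/(2*duration)*t^2)
        phase = 2 * math.pi * (f0 * t + (f1 - f0) / (2 * duration) * t * t)
        samples.append(math.sin(phase))
    return samples

def mix(music, pulse, pulse_gain=0.1):
    """Overlay the near-ultrasonic pulse on the music at low amplitude."""
    return [m + pulse_gain * p for m, p in zip(music, pulse)]

pulse = chirp(18_000, 20_000, 0.01)  # 10 ms sweep; repeated in practice
# Placeholder "music": a 200 Hz tone (real percussive music masks better).
music = [0.8 * math.sin(2 * math.pi * 200 * i / SAMPLE_RATE)
         for i in range(len(pulse))]
covert = mix(music, pulse)
print(len(covert))  # -> 441 samples for the 10 ms window
```

Echoes of the sweep reflected off a moving body arrive back at the microphone with delays proportional to distance (range = speed of sound × delay / 2), which is what lets commodity speakers and microphones act as sonar.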


Left to right: Tadayoshi Kohno, Rajalakshmi Nandakumar, and Shyam Gollakota (Not pictured: Alex Takakuwa)

“To our knowledge, this is the first time anyone has demonstrated that it is possible to convert smart commodity devices into active sonar systems using music,” said Allen School professor and co-author Shyam Gollakota.

By connecting a smartphone to a portable speaker or flat-screen TV, the researchers discovered they could use the data collected through CovertBand to accurately identify repetitive movements such as walking, jumping, and exercising at distances of up to six meters within line of sight, and up to three meters through walls. Having proven the concept, the researchers believe that a combination of more data and machine learning tools would enable rapid classification of a greater variety of movements — and perhaps identification of the individual making them.

With CovertBand, Allen School researchers have identified a plausible threat, given the increasing ubiquity of these devices in our pockets and in our living rooms. But our embrace of emerging technologies needn’t end on a sour note. As professor and co-author Tadayoshi Kohno points out, when it comes to cybersecurity, knowledge is power.

“We’re providing education about what is possible and what capabilities the general public might not know about, so that people can be aware and can build defenses against this,” he said.

The researchers will present a paper detailing their findings at the UbiComp 2017 conference in Maui, Hawaii next month.

Read the full UW News release here. Learn more and listen to samples of the CovertBand attack music on the project web page here. Check out articles on CovertBand in Fast Company, Digital Trends, New Atlas, and The Register.

Security Lab researchers expose cybersecurity risks of DNA sequencing software

(Cross-posted from Allen School News.)


Left to right: Lee Organick, Karl Koscher, and Peter Ney prepare the DNA exploit.

In an illustration of just how narrow the divide between the biological and digital worlds has become, a team of researchers from the Allen School released a study revealing potential security risks in software commonly used for DNA sequencing and analysis — and demonstrated for the first time that it is possible to infect software systems with malware delivered via DNA molecules. The team will present its paper, “Computer Security, Privacy, and DNA Sequencing: Compromising Computers with Synthesized DNA, Privacy Leaks, and More,” at the USENIX Security Symposium in Vancouver, British Columbia next week.

Many open-source systems used in DNA analysis began in the cloistered domain of the research lab. As the cost of DNA sequencing has plummeted, new medical and consumer-oriented services have taken advantage, leading to more widespread use — and with it, potential for abuse. While there is no evidence to indicate that DNA sequencing software is at imminent risk, the researchers say now would be a good time to address potential vulnerabilities.

“One of the big things we try to do in the computer security community is to avoid a situation where we say, ‘Oh shoot, adversaries are here and knocking on our door and we’re not prepared,’” said professor Tadayoshi Kohno, co-director of the Security and Privacy Research Lab, in a UW News release.

Tabloid headline: "Computer Virus Spreads to Humans!"

Researcher Tadayoshi Kohno wondered if what this tabloid headline suggested would work in reverse: Could DNA be used to deliver a virus to a computer?

Kohno and Karl Koscher (Ph.D., ’14), who works with Kohno in the Security and Privacy Research Lab, have been down this road before — literally as well as figuratively. In 2010, they and a group of fellow UW and University of California, San Diego security researchers demonstrated that it was possible to hack into modern automobile systems connected to the internet. They have also explored potential security vulnerabilities in implantable medical devices and household robots.

Kohno conceived of this latest experiment after he came across an online discussion about a tabloid headline in which a person was alleged to have been infected by a computer virus. While he wasn’t about to take that fantastical storyline at face value, Kohno was curious whether the concept might work in reverse.

Kohno, Koscher, and Allen School Ph.D. student Peter Ney — representing the cybersecurity side of the equation — teamed up with professor Luis Ceze and research scientist Lee Organick of the Molecular Information Systems Lab, where they are working on an unrelated project to create a DNA-based storage solution for digital data. The group decided not only would they analyze existing software for vulnerabilities; they would attempt to exploit them.

“We wondered whether under semi-realistic circumstances it would be possible to use biological molecules to infect a computer through normal DNA processing,” Ney said.

As it turns out, it is possible. The team introduced a known vulnerability into software they would then use to analyze the DNA sequence. They encoded a malicious exploit within strands of synthetic DNA, and then processed those strands using the compromised software. When they did, the researchers were able to execute the encoded malware to gain control of the computer on which the sample was being analyzed.
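To make the encoding step concrete, here is one simple way arbitrary bytes — including executable exploit code — can be packed into DNA bases at two bits per base. The mapping and the payload below are illustrative assumptions for this sketch, not the encoding or exploit the team actually used.

```python
# One common 2-bits-per-base mapping (A=00, C=01, G=10, T=11).
TO_BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
FROM_BASE = {base: bits for bits, base in TO_BASE.items()}

def bytes_to_dna(data):
    """Encode bytes as a DNA strand, most-significant bit pair first."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            bases.append(TO_BASE[(byte >> shift) & 0b11])
    return "".join(bases)

def dna_to_bytes(strand):
    """Decode a strand back to bytes (4 bases per byte)."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | FROM_BASE[base]
        out.append(byte)
    return bytes(out)

payload = b"\x90\x90\xcc"  # stand-in bytes, not a real exploit
strand = bytes_to_dna(payload)
print(strand)  # -> GCAAGCAATATA
assert dna_to_bytes(strand) == payload
```

Once such a strand is synthesized and sequenced, the decoded bytes flow into the analysis software like any other read — which is where a vulnerability in that software can turn data into code.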

While there are a number of physical and technical challenges someone would have to overcome to replicate the experiment in the wild, the result should nevertheless serve as a wake-up call for an industry that has not yet had to contend with significant cybersecurity threats. According to Koscher, there are steps companies and labs can take immediately to improve the security of their DNA sequencing software and practice good “security hygiene.”


This output from a DNA sequencing machine includes the team’s exploit.

“There is some really low-hanging fruit out there that people could address just by running standard software analysis tools that will point out security problems and recommend fixes,” he suggested. For the longer term, the group’s recommendations include employing adversarial thinking in setting up new processes, verifying the source of DNA samples prior to processing, and developing the means to detect malicious code in DNA.

The team emphasized that people who use DNA sequencing services should not worry about the security of their personal genetic and medical information — at least, not yet. “Even if someone wanted to do this maliciously, it might not work,” Organick told UW News.

While Ceze admits he is concerned by what the team discovered during their analysis, it is a concern that is largely rooted in conjecture at this point.

“We don’t want to alarm people,” Ceze pointed out. “We do want to give people a heads up that as these molecular and electronic worlds get closer together, there are potential interactions that we haven’t really had to contemplate before.”

Visit the project website and read the UW News release to learn more.

Also see coverage in Wired, The Wall Street Journal, MIT Technology Review, The Atlantic, TechCrunch, Mashable, Gizmodo, ZDNet, GeekWire, Inverse, IEEE Spectrum, and TechRepublic.

Eric Zeng at SOUPS 2017

Today Security Lab Ph.D. student Eric Zeng presented at the 13th Symposium on Usable Privacy and Security (SOUPS) in Santa Clara, CA. Eric presented his paper “End User Security & Privacy Concerns with Smart Homes,” for which he interviewed fifteen people living in smart homes (twelve smart home administrators and three other residents) to learn how they use their smart homes and to understand their security- and privacy-related attitudes, expectations, and actions. This work was done in collaboration with Security Lab postdoc Shrirang Mare and faculty member Franzi Roesner. Read the full research paper here. Congrats, Eric, on a great presentation!

Welcome Earlence!

The Security and Privacy Lab is excited to welcome postdoctoral researcher Earlence Fernandes. Earlence’s research focuses on computer security and privacy, with a special interest in cyber-physical systems. His recent work, a security analysis of the popular SmartThings platform, received the Distinguished Practical Paper Award at the 2016 IEEE Symposium on Security and Privacy. Earlence joins us after earning his Ph.D. at the University of Michigan in 2017, where he was advised by Prof. Atul Prakash. Welcome, Earlence!

Graduation 2017

Congratulations to Dr. Ada Lerner and Dr. Paul Vines on their Ph.D. graduations today! Dr. Lerner will soon be joining Wellesley College as an assistant professor, and Dr. Vines will be joining BAE Systems as a Principal Research Engineer. Congratulations to all the UW Allen School graduates!

Security Lab researchers shed light on secret surveillance with SeaGlass

(Cross-posted from Allen School News.)


Researchers Peter Ney (left) and Ian Smith

A team of researchers in the Allen School’s Security and Privacy Research Lab has developed a new system, SeaGlass, that could bring more transparency and accountability to cell-phone surveillance. SeaGlass is capable of detecting anomalies in the cellular network that may indicate the presence of surveillance devices called IMSI-catchers (also known as cell-site simulators or Stingrays), which track individuals through their International Mobile Subscriber Identity by posing as legitimate cell towers.

WIRED magazine recently talked to the researchers about SeaGlass and their hope that the findings contribute to the public discourse. From the article:

“Law enforcement’s use of the surveillance devices known as stingrays, fake cell towers that can intercept communications and track phones, remains as murky as it is controversial, hidden in non-disclosure agreements and cloak-and-dagger secrecy. But a group of Seattle researchers has found a new method to track those trackers: by recruiting ridesharing vehicles as surveillance devices of their own.

“For two months last year, researchers at the University of Washington paid drivers of an unidentified ridesharing service to keep custom-made sensors in the trunks of their cars, converting those vehicles into mobile cellular data collectors.”


Data collected from a single cell tower in Seattle over two months

The article explains how the information gathered by those mobile data collectors was analyzed to identify irregularities in the network compared to normal cell tower behavior, which could indicate that a surveillance device is active in the area. By using off-the-shelf parts and deploying the system on fleet vehicles that cover a lot of ground during their normal course of business, the researchers demonstrated SeaGlass to be an inexpensive and unobtrusive way to map the cellular landscape.
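One way such an analysis can work, sketched here with hypothetical observations and deliberately simple heuristics (the real system models many more signals): flag a cell ID whose broadcast properties are inconsistent across sightings — for example, a tower that changes channels or is heard from implausibly far-apart locations.

```python
from collections import defaultdict

def flag_anomalies(observations):
    """Flag cell IDs with inconsistent broadcast behavior across sightings.
    Illustrative heuristics only, loosely inspired by SeaGlass-style analysis."""
    channels = defaultdict(set)
    positions = defaultdict(list)
    for obs in observations:
        channels[obs["cell_id"]].add(obs["channel"])
        positions[obs["cell_id"]].append((obs["lat"], obs["lon"]))
    flagged = set()
    for cell_id, chans in channels.items():
        if len(chans) > 1:  # a fixed, legitimate tower keeps its channel
            flagged.add(cell_id)
        lats = [p[0] for p in positions[cell_id]]
        lons = [p[1] for p in positions[cell_id]]
        # A real tower shouldn't be heard tens of km apart (~0.2 degrees).
        if max(lats) - min(lats) > 0.2 or max(lons) - min(lons) > 0.2:
            flagged.add(cell_id)
    return flagged

obs = [
    {"cell_id": "A1", "channel": 512, "lat": 47.61, "lon": -122.33},
    {"cell_id": "A1", "channel": 512, "lat": 47.62, "lon": -122.34},
    {"cell_id": "B2", "channel": 512, "lat": 47.60, "lon": -122.33},
    {"cell_id": "B2", "channel": 687, "lat": 47.90, "lon": -122.90},
]
print(flag_anomalies(obs))  # -> {'B2'}
```

A flag is only a lead, not proof — which matches the researchers’ own framing that anomalies are starting points for further investigation rather than conclusive evidence of an IMSI-catcher.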

During the pilot, the system detected three anomalies in the greater Seattle area that piqued the team’s interest. Although the results do not offer conclusive proof of an IMSI-catcher, they do provide a good starting point for further exploration — and for an overdue conversation about surveillance practices.

“Up until now the use of IMSI-catchers around the world has been shrouded in mystery, and this lack of concrete information is a barrier to informed public discussion,” Ph.D. student Peter Ney said in a UW News release. “Having additional, independent and credible sources of information on cell-site simulators is critical to understanding how — and how responsibly — they are being used.”

The team, which also includes Allen School research scientist Ian Smith, Ph.D. student Gabriel Cadamuro, and professor Tadayoshi Kohno, hopes that its work will contribute to a more robust public debate over cell-phone surveillance. They describe in detail the SeaGlass technology and results of the pilot in a paper that will be published this month in the Proceedings on Privacy Enhancing Technologies.

“SeaGlass is a promising technology that — with wider deployment — can be used to help empower citizens and communities to monitor this type of surveillance,” said Smith. “This issue is bigger than one team of researchers. We’re eager to push this out into the community and find partners who can crowdsource more data collection and begin to connect the dots in meaningful ways.”

Read the WIRED story here, the UW News release here, and additional coverage by TechCrunch, Gizmodo, and Engadget. Learn more at the SeaGlass website here.
