Tadayoshi Kohno named IEEE Fellow

(Cross-posted from Allen School News, by Kristin Osborne)

IEEE Fellow Tadayoshi Kohno’s contributions to security and privacy span multiple domains, from the automobile and medical device industries, to electronic voting and mixed reality.

IEEE honored Kohno for “contributions to cybersecurity” — an apt reference to Kohno’s broad influence across a variety of domains. As co-director of the Allen School’s Security and Privacy Research Lab and the UW Tech Policy Lab, Kohno has explored the technical vulnerabilities and societal implications of technologies ranging from do-it-yourself genealogy research, to online advertising, to mixed reality. 

His first foray into high-profile security research, as a Ph.D. student at the University of California San Diego, struck at the heart of democracy: security and privacy flaws in the software that powered electronic voting machines. What Kohno and his colleagues discovered shocked vendors, elections officials, and other cybersecurity experts.

“Not only could votes and voters’ privacy be compromised by insiders with direct access to the machines, but such systems were also vulnerable to exploitation by outside attackers,” said Kohno. “For instance, we demonstrated that a voter could cast unlimited votes undetected, and they wouldn’t require privileged access to do it.”

After he joined the University of Washington faculty, Kohno turned his attention from safeguarding the heart of democracy to an actual heart when he teamed up with other security researchers and physicians to study the security and privacy weaknesses of implantable medical devices. They found that devices such as pacemakers and cardiac defibrillators that rely on embedded computers and wireless technology to enable physicians to non-invasively monitor a patient’s condition were vulnerable to unauthorized remote interactions that could reveal sensitive health information — or even reprogram the device itself. This groundbreaking work earned Kohno and his colleagues a Test of Time Award from the IEEE Computer Society Technical Committee on Security and Privacy in 2019.

“To my knowledge, it was the first work to experimentally analyze the computer security properties of a real wireless implantable medical device,” Kohno recalled at the time, “and it served as a foundation for the entire medical device security field.”

Kohno and a group of his students subsequently embarked on a project with researchers at his alma mater that revealed the security and privacy risks of increasingly computer-dependent automobiles, and in dramatic fashion: by hacking into a car’s systems and demonstrating how it was possible to take control of its various functions.

“It took the industry by complete surprise,” Kohno said in an article published in 2020. “It was clear to us that these vulnerabilities stemmed primarily from the architecture of the modern automobile, not from design decisions made by any single manufacturer … Like so much that we encounter in the security field, this was an industry-wide issue that would require industry-wide solutions.” 

Those industry-wide solutions included manufacturers dedicating new staff and resources to the cybersecurity of their vehicles, the development of new national automotive cybersecurity standards, and the creation of a new cybersecurity testing laboratory at the National Highway Traffic Safety Administration. Kohno and his colleagues have been recognized multiple times and in multiple venues for their role in these developments, including a Test of Time Award in 2020 from the IEEE Computer Society Technical Committee on Security and Privacy and a Golden Goose Award in 2021 from the American Association for the Advancement of Science. And most importantly, millions of cars — and their occupants — are safer as a result.

Kohno has journeyed into other uncharted territory by exploring how to mitigate privacy and security concerns associated with nascent technologies, from mixed reality to genetic genealogy services. For example, he and Security and Privacy Research Lab co-director Franziska Roesner have collaborated on an extensive line of research focused on safeguarding users’ security and privacy in augmented-reality environments. The results include ShareAR, a suite of developer tools for protecting users’ privacy while enabling interactive features in those environments. They also worked with partners in the UW Reality Lab to organize a summit for members of academia and industry and issue a report exploring design and regulatory considerations for ensuring the security, privacy and safety of mixed reality technologies.

Separately, Kohno teamed up with colleagues in the Molecular Information Systems Lab to uncover how vulnerabilities in popular third-party genetic genealogy websites put users’ sensitive personal genetic information at risk. Members of the same team also demonstrated that DNA sequencing software could be vulnerable to malware encoded into strands of synthetic DNA, an example of the burgeoning field of cyber-biosecurity.

Kohno’s contributions to understanding and mitigating emerging cybersecurity threats extend to autonomous vehicle algorithms, mobile devices, and the Internet of Things. Although projects exposing the hackability of cars and voting machines may capture headlines, Kohno himself is most captivated by the human element of security and privacy research — particularly as it relates to vulnerable populations. For example, he and his labmates recently analyzed the impact of electronic monitoring apps on people subjected to community supervision, also known as “e-carceration.” Their analysis focused not only on the technical issues but also on the experiences of people compelled to use the apps, from privacy concerns to false reports and other malfunctions. Other examples include a project exploring the security and privacy concerns of recently arrived refugees in the United States, examining how language barriers and cultural differences can impede the use of security best practices and leave refugees more vulnerable to scams, and a study of the technology security practices employed by political activists in Sudan in the face of potential government censorship, surveillance, and seizure.

“I chose to specialize in computer security and privacy because I care about people. I wanted to safeguard people against the harms that can result when computer systems are compromised,” Kohno said. “To mitigate these harms, my research agenda spans from the technical — that is, understanding the technical possibilities of adversaries as well as advancing technical approaches to defending systems — to the human, so that we also understand people’s values and needs and how they prefer to use, or not use, computing systems.”

In addition to keeping up with technical advancements that could impact privacy and security, Kohno is keen to push the societal implications of new technologies to the forefront. To that end, he and colleagues have investigated a range of platforms and practices with the goal of educating and supporting the researchers and developers of these technologies, from developing design principles that would safeguard vulnerable and marginalized populations to understanding how online political advertising contributes to the spread of misinformation. He has also attempted to highlight the ethical issues surrounding new technologies through a recent foray into speculative and science fiction writing. For example, his self-published novella “Our Reality” explores how mixed reality technologies designed with a default user in mind can have real-world consequences for people’s education, employment, access to services, and even personal safety.

“It’s important as researchers and practitioners to consider the needs and concerns of people with different experiences than our own,” Kohno said. “I took up fiction writing for the joy of it, but also because I wanted to enable educators and students to explore some of the issues raised by our research in a more accessible way. Instead of identifying how a technology might have gone wrong, I want to help people focus from the start on answering the question, ‘how do we get this right?’”