Usable Security and Privacy Projects

Foregrounding Bystanders as Stakeholders in Smart Home Product Design

As computing advances, we are faced with tough decisions, such as how to balance individual privacy with the potential for innovation. People are often uncomfortable with how data is collected and used, yet we continue to see new data-driven technologies deployed. The oft-touted approach of transparency and control has not been an effective solution for protecting individual privacy. People are ill-equipped to decipher how systems work, so they cannot effectively use tools intended to put them in control. And as technology expands beyond devices for individuals, privacy expands beyond individual choice.

Narrowing The Gap Between Privacy Expectations and Reality in Mobile Health

ICSI and St. Mary's College are collaborating on an NSF-funded project that seeks to answer important questions about privacy and security practices in mobile health technologies (mHealth), such as health apps.

Exploring the Boundaries of Passive Listening in Voice Assistants

Various forms of voice assistants—stand-alone devices or those built into smartphones—are becoming increasingly popular among consumers. Currently, these systems react when you directly address them using a specific wake-word, such as “Alexa,” “Siri,” or “Ok Google.” However, with advancements in speech recognition, the next generation of voice assistants is expected to always listen to the acoustic environment and proactively provide services and recommendations based on human conversations or other audio signals, without being explicitly invoked.

Mobile Dynamic Privacy and Security Analysis at Scale

Current approaches for detecting suspicious application activity on mobile platforms rely on static analysis: reverse-engineering and examining sequences of program code to infer application behavior. This method invariably falls short in that it can only detect what behaviors or capabilities a program might have, not whether and to what extent the program actually engages in those behaviors. It is also susceptible to code obfuscation techniques commonly used in mobile applications.
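To illustrate why static analysis can miss real behavior, consider a sketch (not from the project itself; the class and method names here are invented for illustration) in which a sensitive method is invoked through a name assembled at runtime. A static scan of the call site never sees the method name, yet the call still executes—which is exactly the kind of behavior only dynamic analysis observes.

```python
# Illustration (hypothetical names): why static inspection of call sites
# can miss behavior that dynamic analysis would observe.

class Tracker:
    def get_location(self):
        # Stand-in for a privileged API call returning device coordinates.
        return (37.87, -122.27)

def obfuscated_call(obj):
    # The method name is assembled at runtime (in real apps it might be
    # decoded from a config file or fetched over the network), so the
    # call site contains no literal reference to the sensitive method.
    name = "".join(["get", "_", "loc", "ation"])
    return getattr(obj, name)()  # resolved only when the program runs

coords = obfuscated_call(Tracker())
print(coords)  # (37.87, -122.27)
```

A dynamic-analysis approach instead instruments the running system and records which privileged calls actually occur, and how often, regardless of how the call was reached.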

Increasing Users' Cyber-Security Compliance by Reducing Present Bias

Despite recent advances in increasing computer security through automation, there are still situations in which humans must manually perform computer security tasks. These tasks may include enabling automatic updates, rebooting machines to apply those updates, configuring automatic backups, or enrolling in two-factor authentication. However, even people who view these tasks as important for security often choose to ignore them. Two decades of usable security research have shown that these tasks are often seen as annoyances because they are almost never the user's primary objective.

The Science of Privacy: Implications for Data Usage

The International Computer Science Institute (ICSI) in Berkeley, CA is home to one of six NSA-funded lablets focused on security and privacy research. ICSI's lablet is led by Dr. Serge Egelman, head of the Usable Security and Privacy Research group at ICSI, and includes collaborators at Cornell Tech and UC Berkeley. Other lablets are centered at the University of Kansas, Vanderbilt University, Carnegie Mellon University, the University of Illinois at Urbana-Champaign, and North Carolina State University.

Previous Work: Scaling Contextual Privacy to MDM Environments

It has long been understood that privacy and usability are often in tension: the privacy controls that are often mandated by workplace environments are difficult to use, which results in either low rates of compliance, negative impacts on job performance (e.g., being unable to perform various tasks due to access control restrictions), or inadvertent disclosure of sensitive information (i.e., privacy breaches).

Previous Work: Usable Security of Emerging Healthcare Technologies for Seniors

Older adults (65+) are becoming primary users of technologies, including emerging smart systems, especially in health care. However, such technologies are often not designed for older users and can pose serious privacy and security concerns due to their novelty, complexity, and propensity to collect vast amounts of sensitive information.

Teaching Security

The Teaching Security project is providing classroom-ready materials to support high-school teachers in teaching about important cybersecurity principles, helping students understand the major vulnerabilities, why they occur, and what defensive strategies can be used. The materials focus on inquiry-based activities and hands-on interactive apps and demos that allow students to explore for themselves how cybersecurity works.

AppCensus: Learn the Privacy Costs of Free Apps

There exists a mature ecosystem of developers and service providers that produce mobile applications, often offering them at zero up-front cost. These free apps are supported by advertising networks, which distribute software libraries that developers use for drop-in integration of ad delivery and audience tracking functionality. However, integrated advertiser code and core application code run with the same access privileges—a security and privacy risk that is not readily apparent to end-users or developers.

Previous Work: Using Individual Differences to Personalize Security Mitigations

Researchers at ICSI are leveraging well-studied individual differences in the psychology literature in order to improve computer security outcomes. Specifically, they are looking at how people with different decision-making styles may be more or less receptive to different types of security messaging. Applying techniques from behavioral economics, the goal is to frame security mitigations for individual users so that they see the security messages that are most likely to have an effect on them.

Previous Work: Mobile Contextual Privacy

This project is rethinking how smartphones grant third-party applications access to sensitive user data. Currently, mobile platforms ask the user for permission the first time an application attempts to access certain data types; once access is granted, the user is never asked to make this decision again, even if the context in which subsequent data requests occur is substantially different from the context of the first request. For example, no distinction is made between using location data for location-based features and using it for user tracking.
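The contrast between the current ask-on-first-use model and a contextual model can be sketched in a few lines. This is a simplified illustration under assumed names (`Request`, `AskOnFirstUse`, `Contextual` are not a real platform API): the first model keys decisions only on (app, data type), while the contextual model also keys on circumstances such as whether the app is visible and the purpose of the request.

```python
# Hypothetical sketch: ask-on-first-use vs. contextual permission models.
from dataclasses import dataclass

@dataclass(frozen=True)
class Request:
    app: str
    data_type: str   # e.g. "location"
    visible: bool    # is the app in the foreground?
    purpose: str     # e.g. "navigation" vs. "advertising"

class AskOnFirstUse:
    """Prompts once per (app, data type); all later requests are auto-granted."""
    def __init__(self):
        self.granted = set()

    def allow(self, req, user_says_yes):
        key = (req.app, req.data_type)
        if key not in self.granted:
            if not user_says_yes:
                return False
            self.granted.add(key)
        return True  # never re-prompts, regardless of context

class Contextual:
    """Decides per context: visibility and purpose are part of the key."""
    def __init__(self):
        self.decisions = {}

    def allow(self, req, user_says_yes):
        key = (req.app, req.data_type, req.visible, req.purpose)
        if key not in self.decisions:
            self.decisions[key] = user_says_yes
        return self.decisions[key]

fg = Request("maps", "location", True, "navigation")
bg = Request("maps", "location", False, "advertising")

afu = AskOnFirstUse()
afu.allow(fg, user_says_yes=True)          # user grants foreground navigation
print(afu.allow(bg, user_says_yes=False))  # True: background tracking rides along

ctx = Contextual()
ctx.allow(fg, user_says_yes=True)
print(ctx.allow(bg, user_says_yes=False))  # False: new context, separate decision
```

The design point is that in the contextual model, a background advertising request for location is a different decision from a foreground navigation request, even though both involve the same app and the same data type.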

Previous Work: Security and Privacy for Wearable and Continuous Sensing Platforms

In this collaborative project, researchers at ICSI, UC Berkeley, and the University of Washington are systematically exploring the security and privacy issues raised by the increasing popularity of wearable computers. The recent demand for devices like Google Glass, smart watches, and wearable fitness monitors suggests that wearable computers may become as ubiquitous as cellphones.

Science of Security

In this collaborative project, researchers at ICSI are utilizing Carnegie Mellon University's Security Behavior Observatory (SBO) infrastructure to conduct quantitative experiments about how end-users make security decisions. The results of these experiments are used to design new security mitigations and interventions, which are then iteratively evaluated in the laboratory and the field. This collaboration is designed to provide keen insights into how users make security decisions in situ.

Previous Work: Teaching Resources for Online Privacy Education (TROPE)

Researchers are developing classroom-ready teaching modules to educate young people about why and how to protect their privacy online, as well as a Teachers' Guide with background information, suggested lesson plans, and guidance on how to employ the modules in the classroom.

Previous Work: Privacy Literacy with San Jose Public Library

ICSI researchers are collaborating with the San Jose Public Library and San Jose State University's Game Development club to develop an online tool that will help individuals understand privacy in the digital age and make informed decisions about their online activity. Unlike a standard educational aid, this tool will be unbiased, acknowledging that people have many different definitions of privacy and may have different needs based on what kind of online persona they have created.