Projects
ICSI hosts basic, pre-competitive research of fundamental importance to computer science and engineering. Projects are chosen based on the interests of the Institute’s principal investigators and the strengths of its researchers and affiliated UC Berkeley faculty.
Recent projects are listed below; the full list of each group's projects is accessible via the links listed in the sidebar.
Large-scale physics simulations pose a significant challenge to currently available computational resources, because the costs of both communication and storage largely exceed the cost of the actual computation. The efficient management of the exascale data flows generated by a large-scale simulation is still an unsolved problem. This project aims to provide an initial solution to this problem.
The dramatic increase in our ability to observe massive numbers of measurements from distributed and disparate high-resolution sensors has been instrumental in enhancing our understanding of many physical phenomena. Signal processing (SP) has been the primary driving force in extracting this knowledge of the unseen from observed measurements. However, in the last decade, the exponential increase in observations has outpaced our computing ability to process, understand, and organize this massive but useful information.
The difficulty of automated testing and discovery in interoperability depends on how much information is explicitly known. Interoperability remains a challenging, unsolved problem that relies on manual, error-prone solutions and costs billions of dollars annually. The goal of this research is to investigate an automated approach to the verification and discovery of interoperability, based on the recently developed theory of property-based interoperability. This may enable the next generation of automatically composable and reconfigurable systems.
Funding provided by DARPA
Current approaches for detecting suspicious application activity on mobile platforms rely on static analysis: reverse-engineering and examining sequences of program code to infer application behavior. This method invariably falls short in that it can only detect what behaviors or capabilities a program might have, and not whether and to what extent a program actually engages in these behaviors. It is also susceptible to code obfuscation techniques commonly used by many mobile applications.
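As a minimal, hypothetical sketch of why static inspection alone falls short: in the fragment below, the function that actually runs is chosen from data at runtime, so examining the code's text cannot determine whether the sensitive behavior is ever exercised. The function names and the encoded selector are invented for illustration only.

```python
# Hypothetical illustration: behavior selected at runtime, invisible to static analysis.
import base64

def fetch_contacts():
    # Stand-in for a sensitive capability an analyst would want to flag.
    return ["alice", "bob"]

def show_ad():
    # Benign behavior.
    return "ad shown"

# The target name arrives obfuscated (e.g., decoded or downloaded at runtime),
# so the call graph cannot be resolved by reading the code alone.
encoded = base64.b64encode(b"show_ad").decode()
target = base64.b64decode(encoded).decode()

result = globals()[target]()  # resolved only when the program actually runs
print(result)
```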
Despite recent advances in increasing computer security through automation, there are still situations in which humans must manually perform computer security tasks. These tasks may include enabling automatic updates, rebooting machines to apply those updates, configuring automatic backups, or enrolling in two-factor authentication. However, despite viewing these tasks as important for security, many people still choose to ignore them. Two decades of usable security research have shown that these tasks are often seen as annoyances because they are almost never the user's primary objective.
The Internet has ushered in a new era of communication, and has supported an ever-growing set of applications that have transformed our lives. It is remarkable that all this has taken place with an Internet architecture that has remained unchanged for over forty years. Some view this architectural stagnation as unfortunate but inevitable. After all, it has long been a central tenet that the Internet needs a "narrow waist" at the internetworking layer (L3), a single uniform protocol adopted by everyone; given this assumption, changing this layer is inevitably hard.
Datacenters have redefined the nature of high-end computing, but harnessing their computing power remains a challenging task. Initially, programming frameworks such as MapReduce, Hadoop, Spark, TensorFlow, and Flink provided a way to run large-scale computations. These frameworks took care of the difficult issues of scaling, fault-tolerance, and consistency, freeing the developer to focus on the logic of their particular application. However, each of these frameworks was aimed at a specific computational task (e.g., machine learning, data analytics, etc.) and is not fully general.
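As a hedged illustration of the programming model behind frameworks such as MapReduce (the word-count functions and input below are toy examples, and the "framework" is simulated in plain Python rather than any of the named systems), the developer supplies only a map function and a reduce function, while the surrounding machinery handles grouping and, in a real deployment, scaling and fault tolerance:

```python
# Toy sketch of the MapReduce programming model; not code from any named framework.
from collections import defaultdict

def map_fn(line):
    # Developer-supplied logic: emit (word, 1) for each word in a line.
    return [(word, 1) for word in line.split()]

def reduce_fn(word, counts):
    # Developer-supplied logic: combine all counts emitted for a word.
    return word, sum(counts)

def mapreduce(lines):
    # The "framework": shuffle mapped pairs by key, then reduce each group.
    groups = defaultdict(list)
    for line in lines:
        for key, value in map_fn(line):
            groups[key].append(value)
    return dict(reduce_fn(k, v) for k, v in groups.items())

print(mapreduce(["to be or not to be", "to compute is to be"]))
```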
When the DNS fails, nothing works. One does not need to look beyond many real-world advertising campaigns to appreciate that naming is one of the foundational elements upon which most higher layer Internet services are built. We use names as rendezvous points between users and services (e.g., www.twitter.com). Yet, we do not use names directly in traffic routing. Rather, we turn names into IP addresses via the Domain Name System (DNS). A DNS lookup is therefore a prerequisite for most Internet transactions.
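For illustration, a minimal sketch of the lookup step described above, using only the Python standard library (the hostname is the example from the text; the addresses returned will vary with time and vantage point):

```python
# Minimal sketch: the DNS resolution that precedes most Internet transactions.
import socket

hostname = "www.twitter.com"  # a name used as a rendezvous point

# getaddrinfo performs the name-to-address resolution (via the local resolver).
addresses = {info[4][0] for info in socket.getaddrinfo(hostname, 443)}
print(f"{hostname} resolves to: {sorted(addresses)}")
```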
The amount of data in our world has exploded, with data being at the heart of modern economic activity, innovation, and growth. In many cases, data are modeled as matrices, since an m x n matrix A provides a natural structure to encode information about m objects, each of which is described by n features. As a result, linear algebraic algorithms, and in particular matrix decompositions, have proven extremely successful in the analysis of datasets in the form of matrices.
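As a small illustrative sketch (using a synthetic, hypothetical data matrix), the truncated singular value decomposition is one such matrix decomposition: it yields the best rank-k approximation of an m x n object-by-feature matrix A and is a common first step in analyzing such data:

```python
# Sketch: rank-k approximation of an object-by-feature matrix via the SVD.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 50))   # synthetic m x n matrix: 1000 objects, 50 features

U, s, Vt = np.linalg.svd(A, full_matrices=False)

k = 10                                        # keep the top-k singular values
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]   # best rank-k approximation of A

# Relative reconstruction error in the Frobenius norm.
err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
print(f"rank-{k} relative error: {err:.3f}")
```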
The International Computer Science Institute (ICSI) in Berkeley, CA, is home to one of six NSA-funded lablets focused on security and privacy research. ICSI's lablet is led by Dr. Serge Egelman, head of the Usable Security and Privacy Research group at ICSI, and includes collaborators at Cornell Tech and UC Berkeley. The other lablets are centered at the University of Kansas, Vanderbilt University, Carnegie Mellon University, the University of Illinois at Urbana-Champaign, and North Carolina State University.
It has long been understood that privacy and usability are often in tension: the privacy controls mandated by workplace environments are difficult to use, which results in low rates of compliance, negative impacts on job performance (e.g., being unable to perform various tasks due to access control restrictions), or inadvertent disclosure of sensitive information (i.e., privacy breaches).
Older adults (65+) are becoming primary users of technologies, including emerging smart systems, especially in health care. However, such technologies are often not designed for older users and can pose serious privacy and security concerns due to their novelty, complexity, and propensity to collect vast amounts of sensitive information.
The Teaching Security project is providing classroom-ready materials to support high-school teachers in teaching about important cybersecurity principles, helping students understand the major vulnerabilities, why they occur, and what defensive strategies can be used. The materials focus on inquiry-based activities and hands-on interactive apps and demos that allow students to explore for themselves how cybersecurity works.
There exists a mature ecosystem of developers and service providers that produce mobile applications, often offering them at zero up-front cost. These free apps are supported by advertising networks, which distribute software libraries that developers use for drop-in integration of ad delivery and audience tracking functionality. However, integrated advertiser code and core application code run with the same access privileges, a security and privacy risk that is not readily apparent to end users or developers.
Increasingly, decisions and actions affecting people's lives are determined by automated systems processing personal data. Excitement over the positive contributions of these systems has been accompanied by serious concerns about their opacity and the threats that they pose to privacy, fairness, and other values. Recognizing these concerns, this project seeks to enable real-world automated decision-making systems to be accountable for privacy and fairness.
Self-organizing network (SON) algorithms that were designed for the self-configuration, self-optimization, and self-healing of today's 4G networks exhibit various drawbacks. The two most severe drawbacks are (1) so-called SON use case coordination - the coordination of conflicting network parameter changes - which can lead to sub-optimal network configurations and, more likely, to a worsening of network performance, and (2) a qualitative and quantitative lack of input information to the SON, making reliable network management cumbersome.
This collaborative project led by LLNL is part of the DARPA TRAnsforming DESign (TRADES) program, and focuses on modeling, analysis and synthesis of complex parameterized multi-scale material structures.
The large-scale data being generated in many application domains promise to revolutionize scientific discovery, engineering and technological development, social science understanding, and our ability to monitor masses and influence behavior in subtle ways. In most applications, however, this promise has yet to be fulfilled. One major reason for this is the difficulty of using, in a low-friction manner, cutting-edge algorithmic and statistical tools to explore the data and develop domain-informed models of the processes generating the data.
In this collaborative project with UC Berkeley, ICSI PIs are working with the lead developer of the "Snowflake" censorship circumvention system to refine the code for production deployment in both the Tor Browser Bundle and as a stand-alone application. The work includes developing instrumentation to measure the usage of Snowflake as its deployment rolls out and analyzing the results to assess Snowflake's impact on enabling circumvention.
One of the Internet’s greatest strengths is the degree to which it facilitates access to any of its resources from users anywhere in the world. Various forces, however, have arisen that restrict particular users from accessing particular destinations, resulting in a "balkanization" of the network. This project explores apt methodologies for understanding such balkanization, an effort we will undertake in the context of examining "regional discrimination," i.e., the degree to which certain web services deny or degrade access to users from particular geographic regions.