Upgrade 2020: CIS LAB


Dan Boneh

Professor, Applied Cryptography and Computer Security, Stanford University

Comparing Two Approaches to Updatable Encryption: Lattice vs. Symmetric

Efficient and secure encryption key rotation is a longstanding issue in storing sensitive data in the cloud, one that Stanford University Professor Dan Boneh has been studying for years. He presented some of his most recent findings at Upgrade 2020, the NTT Research Summit. His solution is updatable encryption, which takes the old and new encryption keys and combines them using a re-key generation algorithm. Such an algorithm produces a short re-key that is a combination of the old and new keys. The client then sends that re-key to the cloud, and the provider uses it to re-encrypt the ciphertext without ever seeing the data in the clear. The concept itself is not new: Boneh was part of a team that wrote a paper on it in 2014, and others have worked on the topic since. In his recent work, Boneh and several of his students – Saba Eskandarian, Sam Kim and Maurice Shih – set out to build more efficient constructions for updatable encryption using key-homomorphic PRFs (pseudorandom functions).
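The re-key algebra at the heart of this approach can be sketched with a toy additively key-homomorphic pad. To be clear, the linear pad below is not a secure PRF and this is not Boneh's construction; real schemes use key-homomorphic PRFs such as F(k, x) = H(x)^k in a suitable group. The sketch only shows how the cloud can rotate keys without seeing the plaintext:

```python
import hashlib

# Toy sketch of updatable encryption. WARNING: the linear pad below is NOT a
# secure PRF; it only illustrates the re-key algebra. Real constructions use
# key-homomorphic PRFs, e.g. F(k, x) = H(x)^k in a DDH group.
P = 2**127 - 1  # a Mersenne prime, used as a toy modulus

def pad(key, nonce):
    # "PRF" F(k, x) = k * H(x) mod P, additively homomorphic in the key:
    # F(k1 + k2, x) = F(k1, x) + F(k2, x) mod P.
    h = int.from_bytes(hashlib.sha256(nonce).digest(), "big") % P
    return (key * h) % P

def encrypt(key, m, nonce):
    return (m + pad(key, nonce)) % P

def decrypt(key, ct, nonce):
    return (ct - pad(key, nonce)) % P

def rekey(old_key, new_key):
    # The short update token the client sends to the cloud.
    return (new_key - old_key) % P

def update(token, ct, nonce):
    # Run by the cloud: re-encrypts under the new key without seeing m,
    # since ct + (k_new - k_old)*H(nonce) = m + k_new*H(nonce) mod P.
    return (ct + pad(token, nonce)) % P
```

Because the pad is linear in the key, the cloud's update moves a ciphertext from the old key to the new one while handling only the token and the ciphertext.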

Amit Sahai

Professor, Computer Science, UCLA

Sound Crypto Assumptions Prove Existence of Indistinguishability Obfuscation

Dr. Amit Sahai presented a solution to a problem that has confounded the crypto world for years: indistinguishability obfuscation (IO), a way to obfuscate computer programs such that people can run them but never get a glimpse into the code that makes them work.


“There have been over 100 papers written that show how to use IO to achieve a number of remarkable cryptographic goals that really expand the scope of cryptography in addition to doing just remarkable, really interesting new things,” Sahai said. But all known constructions of IO relied on new hardness assumptions introduced specifically to argue that the obfuscation was secure.


“Unfortunately, this has a tortuous history and many of the assumptions were actually broken, which led to a lot of doubt and uncertainty about the status of IO, whether it really exists or doesn’t exist,” said Sahai, who is Director of the Center for Encrypted Functionalities at the University of California Los Angeles (UCLA).


His work “changes that state of affairs in a fundamental way,” by building IO based on four variations of widely used cryptographic assumptions. His talk goes into depth on each of them.


Ran Canetti

Professor, Computer Science, Boston University

Ran Canetti Provides Proof of Fully Deniable Encryption and Computation

Boston University Computer Science Professor Ran Canetti gave a presentation at the summit summarizing his recent paper that proves deniable encryption is possible even if both parties to a communication are compromised.


Back in 1997 Canetti and his collaborators published a paper (Canetti et al., CRYPTO 1997) proving the concept of deniable encryption, where the secrecy of a communication is protected even if one of the parties is coerced or bribed into giving up their encryption keys. Left open was the question of whether deniable communication is still possible if both parties are coerced.


With his latest paper, co-authored with Sunoo Park and Oxana Poburinnaya, Canetti provides an answer: yes.


In his talk, Canetti brings the complex topic down to earth using an analogy involving Violet and Jack-Jack Parr of The Incredibles.  He also goes into the details and formulas that demonstrate how the encryption scheme works.


Hoeteck Wee

Senior Scientist, NTT Research

A New Approach for Computing Attribute Weighted Sums over Encrypted Data

A challenging problem in functional encryption is computing attribute-weighted sums, where private values are summed with weights determined by a function of public attributes. NTT Research Senior Scientist Hoeteck Wee gave a talk that spelled out an encryption scheme that tackles just that problem.


For example, say you want to find individuals over the age of 40 in the state of Wisconsin who watch Fox News, he said. This group can be represented by a predicate applied to the public attribute values x_i. Now suppose you want to tally how the people in that group responded to a poll, where each private response is a value z_i.


This is where things get interesting. Wee and his team look at the problem from the point of view of functional encryption, landing on a solution that defines a scheme for attribute-weighted sums that applies to a large class of programs.
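The functionality itself is easy to state in the clear; the hard part, which Wee's scheme addresses, is computing it over encrypted responses. In this hypothetical sketch (the predicate and records are illustrative, not from the talk), the function f sees only the public attributes x_i, and the private responses z_i are summed with weights f(x_i):

```python
# Attribute-weighted sums, computed in the clear for illustration.
# Wee's scheme computes this same sum over encrypted z_i, so that the
# individual private responses are never revealed.
def attribute_weighted_sum(f, records):
    return sum(f(x) * z for x, z in records)

# Hypothetical data: x = (age, state, watches_fox) is public;
# z = poll response (0 or 1) is private.
over_40_wi_fox = lambda x: 1 if x[0] > 40 and x[1] == "WI" and x[2] else 0
records = [
    ((55, "WI", True), 1),   # in the group, answered 1
    ((30, "WI", True), 1),   # too young: weight 0
    ((62, "WI", False), 1),  # doesn't watch: weight 0
    ((48, "WI", True), 0),   # in the group, answered 0
]
```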


Elaine Shi

Professor, Cornell and Carnegie Mellon University

Breaking the Speed-of-Light Barrier of Sorting Circuits

When Ajtai, Komlós and Szemerédi described the AKS sorting network in 1983, resulting in an O(n log n) sorting circuit, few could have imagined that the field would see essentially no further progress for almost 40 years. It began to seem that n log n might be a speed-of-light barrier for sorting circuits.


But in her talk at the NTT Research Summit, Dr. Elaine Shi describes how to construct sorting circuits that overcome the n log n barrier for short keys.


Shi, a professor at Carnegie Mellon University (on leave from Cornell), describes results over the past 3 years that push forward the frontier of our understanding regarding sorting circuits, including new upper- and lower-bounds. 

In her talk, she shows how to construct an O(n k)-sized sorting circuit that sorts n elements each with a k-bit key, and also a matching lower bound showing its optimality for every k. 


Dr. Shi’s approach is the first to achieve any non-trivial result using non-comparison-based techniques in the circuit model.
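Shi's results live in the circuit model, where the construction is far more involved, but a rough RAM-model intuition for why short keys escape the comparison-based Ω(n log n) bound is counting sort, which arranges n k-bit keys in O(n + 2^k) time without performing a single comparison:

```python
def counting_sort(keys, k):
    # Sort n k-bit keys in O(n + 2^k) time with no comparisons: tally how
    # often each of the 2^k possible key values occurs, then emit each
    # value the tallied number of times, in order.
    buckets = [0] * (1 << k)
    for key in keys:
        buckets[key] += 1
    out = []
    for value, count in enumerate(buckets):
        out.extend([value] * count)
    return out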


Daniel Wichs

Senior Scientist, NTT Research; Associate Professor of Computer Science, Northeastern University

A New Take on Big-Key Encryption and Incompressible Encodings: Using Useful Data

In his talk, Dr. Daniel Wichs posed an interesting question: whether we can come up with an incompressible representation of a large public data set, such as the entire Wikipedia site.


Wichs, a senior scientist with NTT Research, noted Wikipedia is some 50 gigabytes in size. The challenge he addressed was representing Wikipedia such that the representation requires the full 50GB of storage.


That gets to the idea behind big-key cryptography: mitigating key theft by making keys many gigabytes or even terabytes in size, so that exfiltrating them requires too much bandwidth, or costs the intruder too much in storage, to be worthwhile.


But that creates another problem: it’s not cost-effective for the legitimate user to store all that data, either. That is, unless the data is useful for some purpose other than providing security. Perhaps the data in question is the user’s movie collection, for example.
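One common way to make a huge key usable, sketched here with assumed details (the probe count and hashing scheme are illustrative, in the general spirit of big-key symmetric encryption rather than Wichs's construction), is to derive a short session key by hashing a few randomly probed bytes of the big key; an attacker who exfiltrated only a fraction of it is unlikely to know every probed position:

```python
import hashlib
import secrets

# Illustrative sketch only (details assumed): derive a short session key by
# probing random positions of a huge key.
def derive_session_key(big_key: bytes, num_probes: int = 16):
    probes = [secrets.randbelow(len(big_key)) for _ in range(num_probes)]
    return probes, recompute(big_key, probes)  # the probe list can be public

def recompute(big_key: bytes, probes):
    # Anyone holding the FULL big key (plus the probe list) can re-derive the
    # session key; partial knowledge of big_key likely misses some probe.
    h = hashlib.sha256()
    for i in probes:
        h.update(i.to_bytes(8, "big") + big_key[i:i + 1])
    return h.digest()
```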

In his talk, Wichs laid out the thinking behind his work, including the use of a cryptographic object that uses a relaxed notion of Lossy Trapdoor Permutations.


Arnab Roy

Research Manager, Fujitsu Labs of America

Arnab Roy Presents a More Efficient Approach to NIZK: Quasi-Adaptive

Zero-knowledge protocols are a valuable cryptographic tool in that they enable proving ownership of a digital object without revealing the contents of that object, or “the secret.” But implementations of zero-knowledge proofs can be computationally inefficient. It was this problem that Arnab Roy, Research Manager, Fujitsu Laboratories, addressed at the Upgrade 2020 event.


For the purposes of his talk, and the paper on which it was based, Roy focused on Non-Interactive Zero-Knowledge, or NIZK, which was first developed by Blum, Feldman and Micali (1988). While general Zero-Knowledge protocols can involve multiple rounds of back-and-forth communications, NIZK allows only a single message to be transmitted.
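The single-message flavor of NIZK can be illustrated with a toy Schnorr proof of knowledge of a discrete logarithm, made non-interactive via the Fiat–Shamir heuristic. This is not Roy's quasi-adaptive construction, and the group parameters here are far too small for any real security; it only shows how one message can replace an interactive challenge:

```python
import hashlib

# Toy NIZK: prove knowledge of w with h = g^w (mod p), non-interactively.
# NOT Roy's QA-NIZK; parameters are toy-sized and insecure.
p, q, g = 23, 11, 2  # g = 2 generates a subgroup of prime order q = 11 in Z_23*

def H(*vals):
    # Fiat-Shamir: the verifier's random challenge is replaced by a hash
    # of the transcript, so no interaction is needed.
    data = b"|".join(str(v).encode() for v in vals)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def prove(w, r):
    h = pow(g, w, p)       # public statement
    a = pow(g, r, p)       # prover's commitment (r must be random in practice)
    c = H(g, h, a)         # hashed challenge
    s = (r + c * w) % q    # response
    return h, (a, s)       # the proof is a single message

def verify(h, proof):
    a, s = proof
    c = H(g, h, a)
    return pow(g, s, p) == (a * pow(h, c, p)) % p
```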


His work focused on creating shorter NIZKs, increasing efficiency at the cost of having a tailored common reference string (CRS) for each given language. The solution he came up with embeds language constants into the CRS in such a way that the object functions as its own commitment.


Ilan Komargodski

Scientist, NTT Research

SPARK: A Proposal for Parallelizing Complex Computational Proofs 

Computational proofs have become increasingly important over the years in our highly networked world, where you might want to “outsource” parts of your computational work to cloud vendors. To illustrate the problem, imagine you want to calculate Pi to 5,000,000 digits and then use the result in your own software to calculate something else. If there is an error in the Pi calculation, everything computed from it becomes useless.


Calculating Pi to 5,000,000 digits is a task that’s tailor-made for cloud computing, so you want to outsource it to AWS or Azure. But how can you trust the cloud vendor not to take shortcuts and simply return made-up numbers, so it doesn’t have to spend the CPU cycles performing the actual calculation?
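For Pi specifically there happens to be an ad-hoc spot check, which helps frame what a general-purpose proof must accomplish: the Bailey–Borwein–Plouffe (BBP) formula computes an individual hexadecimal digit of Pi without computing the preceding ones, so a client can verify a handful of random positions of a claimed result. Most outsourced computations admit no such shortcut, which is why succinct proofs are needed. A standard BBP digit extractor looks like this:

```python
def _series(j, d):
    # Fractional part of sum_{k>=0} 16^(d-k) / (8k + j), using modular
    # exponentiation so no large powers of 16 are ever materialized.
    s = 0.0
    for k in range(d + 1):
        s = (s + pow(16, d - k, 8 * k + j) / (8 * k + j)) % 1.0
    t, k = 0.0, d + 1
    while True:
        term = 16.0 ** (d - k) / (8 * k + j)
        if term < 1e-17:
            break
        t += term
        k += 1
    return (s + t) % 1.0

def pi_hex_digit(n):
    # n-th hexadecimal digit of Pi after the point (n >= 1), via the BBP
    # formula: Pi = sum_k 16^-k (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)).
    d = n - 1
    x = (4 * _series(1, d) - 2 * _series(4, d)
         - _series(5, d) - _series(6, d)) % 1.0
    return int(16 * x)
```

In hex, Pi = 3.243F6A88…, so spot-checking a few digits of a vendor's output costs the client far less than redoing the whole computation.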


In his talk, Ilan Komargodski proposed a solution he refers to as SPARK, an acronym for “Succinct Parallelizable Arguments of Knowledge,” based on joint research performed with Naomi Ephraim, Cody Freitag and Rafael Pass. Their proposal significantly reduces the time required for the cloud vendor to provide proofs that it performed the calculation accurately and didn’t just make up results.


Yilei Chen

Staff Research Scientist, Visa Research

A Practical Approach to Effective Cryptography in a Post-Quantum Computing World

Cryptographic trapdoors – mathematical functions that are easy to compute but hard to invert – are the foundation of modern cryptography. The textbook example is that multiplying two extremely large prime numbers is computationally easy, while factoring the product back into its two primes is believed to be hard.
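The asymmetry is easy to see at a small scale (with toy primes far below cryptographic size): the forward direction is a single multiplication, while the naive inverse grinds through trial divisions whose count grows exponentially in the bit-length of the number:

```python
import math

def trial_factor(n):
    # Naive factoring by trial division: the number of candidate divisors
    # grows exponentially in the bit-length of n, whereas the forward
    # direction (multiplying p and q) is a single operation.
    if n % 2 == 0:
        return 2, n // 2
    for d in range(3, math.isqrt(n) + 1, 2):
        if n % d == 0:
            return d, n // d
    return None  # n is prime

p, q = 10007, 10009  # two small primes; real schemes use ~1024-bit primes
n = p * q            # easy direction: one multiplication
```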


In his talk, Yilei Chen from Visa Research discusses hard elliptic curve isogeny problems over RSA moduli and groups with infeasible inversion. Chen’s work expands upon work initiated by Hohenberger in 2003 and further continued by Irrer et al. in 2004.


An elliptic curve isogeny is a mapping from one elliptic curve to another. Chen’s approach can be visualized by picturing an isogeny graph as a volcano and using intersections in that graph as cryptographic primitives. His work shows how to build, from elliptic curve isogenies over RSA moduli, primitives that are easy to compose and compute but very hard to invert – simple composition, yet infeasible inversion.