Paper reading list

This page serves as my paper reading list related to my research area. There are many kinds of security issues related to neural networks. Although the most famous one is using adversarial examples to trick neural networks, my interest focuses on the security of the neural network models themselves and the privacy of user input data. More information about me can be found on my home page.


Homomorphic Encryption

Daniele Micciancio's Lattice Cryptography Links and Vinod Vaikuntanathan's FHE links have a full list of papers on the major HE schemes and the available libraries. This post helps in understanding the details of CKKS.

Used in HElib.

Used in SEAL.

Used in HEAAN.

Use the sine function to approximate the modular reduction operation. Swap slots with coefficients in the ciphertext during bootstrapping.
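As a rough sketch of the idea (my own notation, not the paper's): when the reduced value is small relative to the modulus q, modular reduction can be approximated by a scaled sine, which can then be evaluated homomorphically through its Taylor series and double-angle formulas.

```latex
% Approximate modular reduction by a scaled sine (valid when |[x]_q| << q):
[x]_q \;\approx\; \frac{q}{2\pi}\,\sin\!\left(\frac{2\pi x}{q}\right)
```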

Use Chebyshev polynomials to approximate the modular reduction operation.
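A minimal sketch (plain NumPy, no HE library) of fitting a Chebyshev polynomial to the recentered "mod 1" function on small windows around the integers, which is where bootstrapping needs the approximation; the degree, window width, and range are illustrative assumptions.

```python
import numpy as np

# f(x) = x mod 1, recentered to [-0.5, 0.5).
def centered_mod1(x):
    return x - np.round(x)

# Sample only near the integers -K..K (K and the window width are illustrative).
K, width = 5, 0.1
xs = np.concatenate([k + np.linspace(-width, width, 200) for k in range(-K, K + 1)])
ys = centered_mod1(xs)

# Least-squares Chebyshev fit over the sampled domain.
cheb = np.polynomial.Chebyshev.fit(xs, ys, deg=31)
print("max fit error on sample points:", np.max(np.abs(cheb(xs) - ys)))
```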


Secure Multi-Party Computation (SMPC)

A good library for arithmetic sharing, Boolean sharing, and Yao's garbled circuits.

Arithmetic Sharing

Boolean Sharing

Garbled Circuits

Detailed introduction to Yao's basic garbled circuits and their optimizations.

Brief introduction to Yao's basic garbled circuits and their optimizations.

Slides on Yao's basic garbled circuits.

Oblivious Transfer


Neural Networks

HE + NN

First work to use HE for NN inference. Modifications to NNs: (1) square activation; (2) sum pooling.
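To illustrate the two modifications (a plain NumPy sketch on unencrypted data, not an HE implementation): both replacements use only additions and multiplications, which HE supports directly.

```python
import numpy as np

def square_activation(x):
    # Replaces ReLU: squaring needs only one multiplication.
    return x * x

def sum_pool(x, k):
    # Replaces max pooling: summing each k x k window needs only additions.
    h, w = x.shape
    return x[:h - h % k, :w - w % k].reshape(h // k, k, w // k, k).sum(axis=(1, 3))

feature_map = np.arange(16, dtype=float).reshape(4, 4)
print(square_activation(feature_map))
print(sum_pool(feature_map, 2))
```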

Based on CryptoNets: add batch normalization layers and use higher-degree polynomials to approximate ReLU.
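A small sketch of the polynomial-approximation idea (NumPy least squares; the degree and input range are my own illustrative choices, not the paper's):

```python
import numpy as np

# Fit a polynomial to ReLU on [-1, 1]; inputs are assumed to be normalized
# into this range (e.g. by the added batch normalization layers).
xs = np.linspace(-1, 1, 1000)
relu = np.maximum(xs, 0)

poly_relu = np.poly1d(np.polyfit(xs, relu, deg=4))  # degree 4 is illustrative
print("max approximation error:", np.max(np.abs(poly_relu(xs) - relu)))
```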

(1) Use TFHE as the HE library to encrypt the input bit by bit. (2) Quantize weights to powers of 2. (3) Use homomorphic gates to build the ReLU and max-pooling functions.
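To give a feel for item (3), here is a toy sketch in plain Python (no TFHE) that computes ReLU on an 8-bit two's-complement value using only NOT and AND operations, i.e. the kind of gate-level circuit that would be evaluated over encrypted bits:

```python
def to_bits(x, n=8):
    # Two's-complement bits, least significant first.
    return [(x >> i) & 1 for i in range(n)]

def from_bits(bits):
    x = sum(b << i for i, b in enumerate(bits))
    return x - (1 << len(bits)) if bits[-1] else x

def relu_bits(bits):
    # ReLU with only NOT and AND gates: zero every bit when the sign bit is 1.
    keep = 1 - bits[-1]              # NOT(sign)
    return [b & keep for b in bits]  # AND each bit with NOT(sign)

for v in (-5, 0, 7):
    print(v, "->", from_bits(relu_bits(to_bits(v & 0xFF))))
```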

SMPC + NN

Data pre-processing. Model pruning. The client can use a proxy server for garbling the circuits. Leverage industrial logic synthesis tools. Use sequential circuits as opposed to combinational circuits.

HE + SMPC + NN

Use the ABY library for 2PC and SEAL for HE; use HE to generate the shares. Use arithmetic sharing for matrix-vector multiplication, mean pooling, and square activations; use GC for ReLU, sigmoid, max pooling, and rescaling values back into range.
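A rough sketch of the HE-to-shares handoff (plain modular arithmetic stands in for the ciphertexts; the modulus and names are illustrative assumptions):

```python
import secrets

P = 2**61 - 1  # illustrative modulus for the arithmetic shares

def he_to_shares(wx):
    # Stand-in for: the server evaluates Enc(W*x) under HE, adds a random mask r,
    # and sends Enc(W*x + r); the client decrypts its share, the server keeps -r.
    r = secrets.randbelow(P)
    client_share = (wx + r) % P
    server_share = (-r) % P
    return client_share, server_share

wx = 123456789                      # pretend this is one entry of W*x
c, s = he_to_shares(wx)
assert (c + s) % P == wx % P        # the shares reconstruct the linear-layer output
```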

Use HE to compute the convolutional and fully connected layers; use GC to compute the activation and pooling layers.

Reorder slots in ciphertexts.

Watermarks + NN

Hardware + NN

Use hardware information as fingerprints to protect NN models.

Same idea as AEP. Use eDRAM startup values as PUFs.

Quantize weights to integers that are powers of 2. Add random noise to protect the weights.
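A minimal sketch of the quantization step (NumPy; the scale factor is an illustrative assumption, and the noise-addition step is not shown):

```python
import numpy as np

def quantize_pow2(w, scale=256):
    # Scale weights to integers (scale is illustrative), then round each one
    # to the nearest signed power of 2.
    x = w * scale
    sign = np.sign(x)
    exp = np.round(np.log2(np.maximum(np.abs(x), 1)))  # exponent >= 0
    return sign * 2 ** exp

w = np.array([0.3, -0.07, 0.9, -1.4])
print(quantize_pow2(w))  # -> [ 64. -16. 256. -256.]
```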

Attacks