CryptoDB
Yanheng Lu
Publications
Year: 2024
Venue: TCHES
Title: SHAPER: A General Architecture for Privacy-Preserving Primitives in Secure Machine Learning
Abstract
Secure multi-party computation and homomorphic encryption are two primary security primitives in privacy-preserving machine learning, whose wide adoption is nevertheless constrained by computation and network communication overheads. This paper proposes a hybrid Secret-sharing and Homomorphic encryption Architecture for Privacy-pERsevering machine learning (SHAPER). SHAPER protects sensitive data in encrypted or randomly shared domains instead of relying on a trusted third party. The proposed algorithm-protocol-hardware co-design methodology explores techniques such as plaintext Single Instruction Multiple Data (SIMD) and fine-grained scheduling to minimize end-to-end latency in various network settings. SHAPER also accelerates secure-domain computation and the conversion between mainstream privacy-preserving primitives, making it suitable for both general and distinctive data characteristics. SHAPER is evaluated by FPGA prototyping with a comprehensive hyper-parameter exploration, demonstrating a 94x speed-up over CPU clusters on large-scale logistic regression training tasks.
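
To give a feel for the "randomly shared domain" and plaintext-SIMD batching mentioned in the abstract, the minimal Python sketch below shows two-party additive secret sharing over Z_2^64, vectorized so that a whole batch of secrets is shared and recombined at once. This is only an illustration under common assumptions, not the paper's protocol or code, and the function names (share, reconstruct) are hypothetical.

import numpy as np

def share(x: np.ndarray, rng: np.random.Generator):
    # Illustrative only: split a uint64 vector into two random additive
    # shares such that x = s0 + s1 (mod 2^64).
    s0 = rng.integers(0, np.iinfo(np.uint64).max, size=x.shape,
                      dtype=np.uint64, endpoint=True)
    s1 = x.astype(np.uint64) - s0  # unsigned subtraction wraps modulo 2^64
    return s0, s1

def reconstruct(s0: np.ndarray, s1: np.ndarray) -> np.ndarray:
    # Recombine the two shares (addition wraps modulo 2^64).
    return s0 + s1

rng = np.random.default_rng(0)
x = np.arange(8, dtype=np.uint64)      # one batch of secrets, SIMD-style
y = np.full(8, 3, dtype=np.uint64)

x0, x1 = share(x, rng)
y0, y1 = share(y, rng)

# Additions stay local: each party adds its own shares, no communication.
z0, z1 = x0 + y0, x1 + y1
assert np.array_equal(reconstruct(z0, z1), x + y)

Non-linear operations and share-to-ciphertext conversion (the HE side of the hybrid design) require interaction and are beyond this sketch.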
Coauthors
- Zhaohui Chen (1)
- Zhen Gu (1)
- Qi’ao Jin (1)
- Ziyuan Liang (1)
- Yanheng Lu (1)
- Zhiyong Wang (1)
- Fan Zhang (1)