Abstract
Vertical federated learning (VFL) enables joint modeling between participants that share the same sample ID space but hold different feature spaces. Privacy-preserving (PP) VFL is challenging because no single entity owns the complete set of labels and features, and participants must interact more frequently and directly. Existing PP schemes for VFL are often limited in communication cost, in the model types they support, and in the number of participants they can accommodate. We propose FLFHNN, a novel PP framework for heterogeneous neural networks based on CKKS fully homomorphic encryption (FHE). By exploiting FHE's support for rich ciphertext computation, FLFHNN removes the restriction to a limited class of generalized linear models, realizes "short-link" communication between participants, and performs training and inference entirely on encrypted data, ensuring the confidentiality of shared information and preventing potential leakage from the aggregated values exchanged in federated learning. In addition, FLFHNN extends flexibly to multi-party scenarios, with its algorithm adapting to the number of participants. Our analysis and experiments show that, compared with a Paillier-based scheme, FLFHNN significantly reduces the system's communication cost while retaining model accuracy: the interactions and the volume of information transmitted during training are reduced by almost 2/3 and by more than 30%, respectively, making the framework well suited to large-scale Internet-of-Things scenarios.
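For context on the baseline the abstract compares against: Paillier-based VFL schemes rely on additive homomorphism, where the product of two ciphertexts decrypts to the sum of the plaintexts. The toy sketch below (textbook construction with deliberately insecure key sizes, purely illustrative and not part of FLFHNN, which uses CKKS) demonstrates that property.

```python
import math
import random

# Toy Paillier cryptosystem (textbook form, insecure key sizes) illustrating
# the additively homomorphic property used by Paillier-based VFL baselines:
# Enc(m1) * Enc(m2) mod n^2 decrypts to m1 + m2 mod n.

def keygen(p=251, q=241):
    """Generate a toy key pair from two small, hard-coded primes."""
    n = p * q
    n2 = n * n
    g = n + 1                        # standard simplification g = n + 1
    lam = math.lcm(p - 1, q - 1)     # Carmichael function lambda(n)
    mu = pow(lam, -1, n)             # modular inverse, valid when g = n + 1
    return (n, n2, g), (lam, mu)

def encrypt(pub, m):
    n, n2, g = pub
    r = random.randrange(1, n)       # randomness must be coprime to n
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(pub, priv, c):
    n, n2, _ = pub
    lam, mu = priv
    L = (pow(c, lam, n2) - 1) // n   # the function L(x) = (x - 1) / n
    return L * mu % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 12), encrypt(pub, 30)
c_sum = c1 * c2 % pub[1]             # homomorphic addition on ciphertexts
print(decrypt(pub, priv, c_sum))     # -> 42
```

Unlike Paillier, CKKS additionally supports (approximate) multiplication over encrypted real-valued vectors, which is what allows a framework such as FLFHNN to evaluate neural-network layers, rather than only linear aggregates, under encryption.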
Acknowledgements
This work is supported by the cooperation project between Chongqing municipal undergraduate universities and institutes affiliated to CAS (HZ2021015).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Sun, H., Zhang, Y., Li, M., Xu, Z. (2022). FLFHNN: An Efficient and Flexible Vertical Federated Learning Framework for Heterogeneous Neural Network. In: Wang, L., Segal, M., Chen, J., Qiu, T. (eds) Wireless Algorithms, Systems, and Applications. WASA 2022. Lecture Notes in Computer Science, vol 13471. Springer, Cham. https://doi.org/10.1007/978-3-031-19208-1_28
DOI: https://doi.org/10.1007/978-3-031-19208-1_28
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-19207-4
Online ISBN: 978-3-031-19208-1
eBook Packages: Computer Science (R0)