Abstract
Fully homomorphic encryption (FHE) is an encryption method that allows computation to be performed on encrypted data, without decryption. FHE preserves the privacy of users of online services that handle sensitive data, such as health data, biometrics, credit scores and other personal information. A common way to provide a valuable service on such data is through machine learning and, at this time, neural networks are the dominant machine learning model for unstructured data.
In this work we show how to construct deep neural networks (DNNs) that are compatible with the constraints of TFHE, an FHE scheme that supports computation circuits of arbitrary depth. We discuss these constraints and present DNN architectures for two computer vision tasks. We benchmark the architectures using the Concrete stack (https://github.com/zama-ai/concrete-ml), an open-source implementation of TFHE; a sketch of this workflow is given below.
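As an illustration of the workflow described above, the following minimal sketch shows how a small quantization-aware-trained (QAT) CNN might be compiled for encrypted inference with Concrete ML. It is not the paper's actual architecture: the toy network, the 3-bit widths, and the compile_brevitas_qat_model / fhe="execute" entry points follow the project's public examples and are assumptions that may differ across library versions.

# Hedged sketch: low-bit QAT network compiled for FHE inference with Concrete ML.
# Low bit-widths keep TFHE accumulator sizes small, which is the central constraint.
import torch
import torch.nn as nn
import brevitas.nn as qnn
from concrete.ml.torch.compile import compile_brevitas_qat_model

class TinyQuantCNN(nn.Module):
    """Toy QAT network with 3-bit weights and activations (illustrative only)."""
    def __init__(self, n_bits: int = 3):
        super().__init__()
        self.qid = qnn.QuantIdentity(bit_width=n_bits)                      # quantize the input
        self.conv = qnn.QuantConv2d(1, 8, kernel_size=3, weight_bit_width=n_bits)
        self.relu = qnn.QuantReLU(bit_width=n_bits)
        self.fc = qnn.QuantLinear(8 * 26 * 26, 10, bias=True, weight_bit_width=n_bits)

    def forward(self, x):
        x = self.relu(self.conv(self.qid(x)))
        return self.fc(torch.flatten(x, 1))

model = TinyQuantCNN()
# ... train the model with standard PyTorch QAT here ...

# Representative calibration set used by the compiler to fix quantization scales.
calib = torch.randn(100, 1, 28, 28)
quantized_module = compile_brevitas_qat_model(model, calib)

# Run one encrypted inference; the fhe="execute" flag is version-dependent
# (fhe="simulate" can be used to estimate accuracy without running FHE).
x = torch.randn(1, 1, 28, 28).numpy()
y = quantized_module.forward(x, fhe="execute")

The design point this sketch is meant to convey is that the FHE constraints are handled at training time, through quantization-aware training, so that the compiled circuit only manipulates small integers.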
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Stoian, A., Frery, J., Bredehoft, R., Montero, L., Kherfallah, C., Chevallier-Mames, B. (2023). Deep Neural Networks for Encrypted Inference with TFHE. In: Dolev, S., Gudes, E., Paillier, P. (eds) Cyber Security, Cryptology, and Machine Learning. CSCML 2023. Lecture Notes in Computer Science, vol 13914. Springer, Cham. https://doi.org/10.1007/978-3-031-34671-2_34
DOI: https://doi.org/10.1007/978-3-031-34671-2_34
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-34670-5
Online ISBN: 978-3-031-34671-2
eBook Packages: Computer Science, Computer Science (R0)