DOI: 10.1145/3494885.3494923
research-article

Study on linear optimization of activation function of homomorphic encryption neural network

Published: 20 December 2021

ABSTRACT

Since the late 20th century, research on neural networks over encrypted data has steadily deepened. Encryption schemes for neural networks have evolved from leveled homomorphic encryption to fully homomorphic encryption, and throughout this development the prediction accuracy of the network and the computational resources it consumes have remained the core concerns. This paper studies the optimization of the activation layer of the neural network. The convolution and pooling layers of the hidden layers are linear operations and can therefore be computed directly on ciphertexts, whereas the activation layer introduces nonlinearity into the network through nonlinear functions, so ciphertext computation cannot be applied to it directly. To solve this problem, this paper builds on the CryptoNets model and considers low-degree polynomial approximations of the three commonly used activation functions: Sigmoid, Tanh, and ReLU. It proposes a new approximation method in which an approximate function is constructed from the structural characteristics of the function's derivative. Compared with commonly used approximation methods, replacing the activation function with the proposed approximation yields better performance in both prediction accuracy and running time.
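The paper itself provides no code, but the baseline technique it compares against, replacing a nonlinear activation with a low-degree least-squares polynomial so that only additions and multiplications (the operations a homomorphic scheme can evaluate) remain, can be sketched as follows. This is an illustrative sketch only: the fitting intervals and degrees below are assumptions, and the paper's proposed derivative-based construction is not reproduced here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def poly_fit(f, degree, lo, hi, n=2001):
    """Least-squares polynomial approximation of f on [lo, hi].

    After the replacement the layer needs only additions and
    multiplications, which ciphertext arithmetic supports.
    """
    x = np.linspace(lo, hi, n)
    return np.polynomial.Polynomial.fit(x, f(x), degree)

# Low-degree stand-ins for the three activations discussed in the paper.
# The intervals are assumed; in practice they would be chosen from the
# range of the pre-activation values observed in the network.
sig_p = poly_fit(sigmoid, 3, -4.0, 4.0)
tanh_p = poly_fit(np.tanh, 3, -2.0, 2.0)
relu_p = poly_fit(lambda x: np.maximum(x, 0.0), 2, -1.0, 1.0)

xs = np.linspace(-4.0, 4.0, 2001)
print("max |sigmoid - p3| on [-4, 4]:",
      float(np.max(np.abs(sigmoid(xs) - sig_p(xs)))))
```

The quality of such a replacement degrades quickly outside the fitting interval, which is one reason the choice of approximation method matters for both accuracy and the multiplicative depth (and hence running time) of the encrypted evaluation.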

References

  1. Dowlin N, Gilad-Bachrach R, Laine K, et al. CryptoNets: Applying Neural Networks to Encrypted Data with High Throughput and Accuracy [R]. IEEE, 2016.
  2. Bourse F, Minelli M, Minihold M, et al. Fast Homomorphic Evaluation of Deep Discretized Neural Networks [C]. 2017: 483-512.
  3. Hesamifard E, Takabi H, Ghasemi M. CryptoDL: Deep Neural Networks over Encrypted Data [R]. 2017.
  4. Badawi A A, Chao J, Jie L, et al. The AlexNet Moment for Homomorphic Encryption: HCNN, the First Homomorphic CNN on Encrypted Data with GPUs [J]. IEEE Transactions on Emerging Topics in Computing, 2020, PP(99): 1-1.
  5. Liu Xiaowen, Guo Dabo, Li Cong. An improvement of the activation function in convolutional neural networks [J]. Test Technology, 2019, 33(02): 121-125.
  6. Lai Ce. Analysis of the activation function in convolutional neural networks [J]. Science and Technology Innovation, 2019(033): 35-36.
  7. Wang Shuangyin, Teng Guowen. Design of activation function optimization in convolutional neural networks [J]. ICT, 2018(001): 42-43.
  8. Zhou Feiyan, Jin Linpeng, Dong Jun. Review of convolutional neural network studies [J]. Journal of Computer Science, 2017(6): 1-1.
  9. Jiang Onbo, Wang Weiwei. Activation function optimization study [J]. Sensors and Microsystems, 2018, 37(02): 56-58.
  10. Tian Juan, Li Yingxiang, Li Hongyan. Comparative study of the activation function in convolutional neural networks [J]. Computer Systems Applications, 2018, 27(07): 45-51.
  11. Qu Zhilin, Hu Xiaofei. Study on convolutional neural networks based on improved activation functions [J]. Computer Technology and Development, 2017, 27(012): 77-80.

Published in
    CSSE '21: Proceedings of the 4th International Conference on Computer Science and Software Engineering
    October 2021
    366 pages
    ISBN: 9781450390675
    DOI: 10.1145/3494885

    Copyright © 2021 ACM

    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Acceptance Rates

    Overall acceptance rate: 33 of 74 submissions, 45%