
FedPV-FS: A Feature Selection Method for Federated Learning in Insurance Precision Marketing

  • Conference paper
  • First Online:
Intelligent Information Processing XII (IIP 2024)

Part of the book series: IFIP Advances in Information and Communication Technology ((IFIPAICT,volume 703))


Abstract

Insurance companies often use federated learning to integrate external data sources for data analysis and to improve the conversion rate of insurance precision marketing. However, because of imbalanced data distributions and the presence of null values, joint modeling frequently suffers from low robustness and is prone to under-fitting. Feature selection therefore needs to be incorporated into federated learning before joint modeling to improve prediction accuracy. In this paper, we propose FedPV-FS, a method that combines two-party feature selection based on publicly verifiable covert (PVC) security with multi-party federated feature selection based on verifiable secret sharing (VSS). We further optimize the federated feature selection iteratively through data selection, transformation, and integration. Experiments show that our method achieves high-quality feature selection, raising the optimization objective to 88.4%, promotes a continuous increase in insurance premiums, and is well suited to insurance precision marketing scenarios.
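To make the multi-party idea in the abstract concrete, the following is a minimal, hypothetical Python sketch of federated feature selection over additively secret-shared scores. It is not the paper's FedPV-FS protocol: the function names, the additive sharing scheme, the fixed-point encoding, and the top-k scoring rule are illustrative assumptions, and the verifiability guarantees that PVC and VSS provide in the actual method are omitted entirely.

```python
# Hedged sketch, NOT the paper's FedPV-FS protocol. Illustrates only the general
# pattern of multi-party federated feature selection: each party splits its local
# feature-importance scores into additive shares, shares are combined per holder,
# and only the aggregate score per feature is revealed before top-k selection.
import random
from typing import List

PRIME = 2**61 - 1          # field modulus for additive sharing
SCALE = 10**6              # fixed-point scaling for fractional scores


def share(value: float, n_parties: int) -> List[int]:
    """Split a fixed-point value into n additive shares modulo PRIME."""
    fixed = int(round(value * SCALE)) % PRIME
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    last = (fixed - sum(shares)) % PRIME
    return shares + [last]


def federated_top_k(local_scores: List[List[float]], k: int) -> List[int]:
    """Aggregate per-party feature scores without revealing any single party's
    values, then return the indices of the k highest aggregate scores."""
    n_parties = len(local_scores)
    n_features = len(local_scores[0])

    # held[p][f] accumulates the shares that party p holds for feature f,
    # summed over all contributing parties.
    held = [[0] * n_features for _ in range(n_parties)]
    for scores in local_scores:
        for f, s in enumerate(scores):
            for p, sh in enumerate(share(s, n_parties)):
                held[p][f] = (held[p][f] + sh) % PRIME

    # Parties publish only their accumulated shares; summing them recovers the
    # aggregate score per feature, but no individual party's contribution.
    agg = []
    for f in range(n_features):
        total = sum(held[p][f] for p in range(n_parties)) % PRIME
        if total > PRIME // 2:        # map back from the field to signed values
            total -= PRIME
        agg.append(total / SCALE)

    return sorted(range(n_features), key=lambda f: agg[f], reverse=True)[:k]


if __name__ == "__main__":
    # Example: three parties, five candidate features, select the top two.
    parties = [
        [0.12, 0.40, 0.05, 0.30, 0.01],
        [0.20, 0.35, 0.10, 0.25, 0.02],
        [0.15, 0.45, 0.08, 0.28, 0.03],
    ]
    print(federated_top_k(parties, k=2))   # -> [1, 3]
```

The point of the sketch is only that the aggregator learns per-feature aggregate scores rather than any single party's scores; the paper's method additionally addresses two-party selection and verifiability through PVC and VSS.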




Author information

Corresponding author

Correspondence to Chunkai Wang.



Copyright information

© 2024 IFIP International Federation for Information Processing

About this paper


Cite this paper

Wang, C., Feng, J. (2024). FedPV-FS: A Feature Selection Method for Federated Learning in Insurance Precision Marketing. In: Shi, Z., Torresen, J., Yang, S. (eds) Intelligent Information Processing XII. IIP 2024. IFIP Advances in Information and Communication Technology, vol 703. Springer, Cham. https://doi.org/10.1007/978-3-031-57808-3_31


  • DOI: https://doi.org/10.1007/978-3-031-57808-3_31

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-57807-6

  • Online ISBN: 978-3-031-57808-3

  • eBook Packages: Computer Science, Computer Science (R0)
