Client Selection Based on Diversity Scaling for Federated Learning on Non-IID Data

  • Conference paper
  • First Online:
Broadband Communications, Networks, and Systems (BROADNETS 2023)

Abstract

In a wireless Federated Learning (FL) system, clients train local models over the local datasets on their IoT devices. The derived local models are uploaded to the FL server, which aggregates them into a global model and then broadcasts the global model back to the clients for further training. Because clients are heterogeneous, client selection plays an important role in determining the overall training time. Traditionally, the maximum number of clients is selected, provided they can derive and upload their local models before the deadline of each global iteration. However, selecting more clients not only increases the energy consumption of the clients but may also be unnecessary, since having fewer clients in early global iterations and more clients in later iterations has been shown to improve model accuracy. To address this issue, this paper proposes a client selection scheme that dynamically adjusts and optimizes the trade-off between maximizing the number of selected clients and minimizing the total communication cost between the clients and the server. By comparing the data diversity of clients, the scheme selects the clients most suitable for global convergence. A Diversity Scaling Node Selection framework (FedDS) is implemented to dynamically change the selection weight of each node based on the degree of non-IID data diversity. Results show that the proposed FedDS speeds up the FL convergence rate compared to FedAvg with random node selection.
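
The abstract does not give the algorithmic details of FedDS, so the following is only a minimal sketch of diversity-scaled client selection under stated assumptions: diversity is approximated by the distance between each client's label distribution and the pooled (global) label distribution, and the number of selected clients grows linearly across rounds, as motivated in the abstract. The function and parameter names (diversity_scores, select_clients, min_k, max_k) are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

def diversity_scores(client_label_counts):
    """Illustrative diversity metric: L1 distance between each client's label
    distribution and the pooled (global) label distribution. Higher score
    means more non-IID. This is a stand-in for the paper's actual measure."""
    counts = np.asarray(client_label_counts, dtype=float)  # shape (clients, classes)
    client_dist = counts / counts.sum(axis=1, keepdims=True)
    global_dist = counts.sum(axis=0) / counts.sum()
    return np.abs(client_dist - global_dist).sum(axis=1)

def select_clients(scores, round_idx, total_rounds, min_k, max_k, rng=None):
    """Select fewer clients in early rounds and more in later rounds,
    drawing without replacement using diversity-scaled selection weights."""
    if rng is None:
        rng = np.random.default_rng()
    # Linearly grow the number of selected clients from min_k to max_k.
    k = int(round(min_k + (max_k - min_k) * round_idx / max(total_rounds - 1, 1)))
    weights = scores / scores.sum()  # diversity-scaled selection probabilities
    return rng.choice(len(scores), size=k, replace=False, p=weights)

# Toy usage: 10 clients, 5 label classes, round 3 of 20.
rng = np.random.default_rng(0)
label_counts = rng.integers(1, 50, size=(10, 5))
scores = diversity_scores(label_counts)
print(select_clients(scores, round_idx=3, total_rounds=20, min_k=2, max_k=8, rng=rng))
```

In this sketch the selected client IDs would then be sent the current global model, and their updates aggregated as in FedAvg; whether FedDS uses label statistics, gradient diversity, or another signal for the scaling weights is specified in the paper itself, not here.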

Author information

Corresponding author

Correspondence to Yuechao Ren.


Copyright information

© 2023 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper

Cite this paper

Ren, Y., Sajjanhar, A., Gao, S., Loke, S. (2023). Client Selection Based on Diversity Scaling for Federated Learning on Non-IID Data. In: Wang, W., Wu, J. (eds) Broadband Communications, Networks, and Systems. BROADNETS 2023. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 511. Springer, Cham. https://doi.org/10.1007/978-3-031-40467-2_8

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-40467-2_8

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-40466-5

  • Online ISBN: 978-3-031-40467-2

  • eBook Packages: Computer Science, Computer Science (R0)
