SmartDL: energy-aware decremental learning in a mobile-based federation for geo-spatial system

  • S.I.: Deep Geospatial Data Understanding
  • Published in Neural Computing and Applications

Abstract

Federated learning is designed to collaboratively train a shared model across a large number of mobile devices while preserving data privacy, and it has been widely adopted to support geo-spatial systems. However, two critical issues prevent federated learning from being effectively deployed on resource-constrained devices at large scale. First, federated learning incurs high energy consumption, which can severely shorten the battery lifetime of mobile devices. Second, sensitive personal information can still leak during the training process. A system that effectively protects sensitive information while improving energy efficiency is therefore urgently needed for mobile-based federated learning. This paper proposes SmartDL, an energy-aware decremental learning framework that balances energy efficiency and data privacy in an efficient manner. SmartDL improves energy efficiency at two levels: (1) the global level, which adopts an optimization approach to select a subset of participating devices with sufficient capacity and maximum reward; (2) the local level, which adopts a novel decremental learning algorithm to actively provide decremental and incremental updates, and can adaptively tune the local DVFS (dynamic voltage and frequency scaling) at the same time. We prototyped SmartDL on a physical testbed and evaluated its performance using several learning benchmarks with real-world traces. The evaluation results show that, compared with original federated learning, SmartDL reduces energy consumption by 75.6–82.4% across different datasets. Moreover, SmartDL achieves a speedup of 2–4 orders of magnitude in model convergence while preserving model accuracy.
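The global-level device selection described above can be illustrated with a minimal sketch. This is not the paper's actual optimization (which is not reproduced on this page); it is a hypothetical greedy filter-and-rank example, with invented device names, a made-up capacity threshold, and a scalar "reward" standing in for a device's expected contribution to the global model:

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    capacity: float   # remaining energy/compute budget (arbitrary units)
    reward: float     # expected contribution to the global model (hypothetical)

def select_devices(devices, min_capacity, k):
    """Pick up to k devices with capacity >= min_capacity, maximizing total reward."""
    eligible = [d for d in devices if d.capacity >= min_capacity]
    eligible.sort(key=lambda d: d.reward, reverse=True)  # greedy: highest reward first
    return eligible[:k]

pool = [
    Device("phone-a", capacity=0.9, reward=3.2),
    Device("phone-b", capacity=0.2, reward=5.0),  # high reward but too drained
    Device("phone-c", capacity=0.7, reward=2.1),
    Device("phone-d", capacity=0.8, reward=4.4),
]
chosen = select_devices(pool, min_capacity=0.5, k=2)
print([d.name for d in chosen])  # ['phone-d', 'phone-a']
```

A greedy rank-and-cut like this captures the intuition of "sufficient capacity, maximum reward"; the full problem, with per-round budgets and fairness across devices, is closer to a constrained combinatorial optimization.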


Notes

  1. https://hub.docker.com/r/goodlab/deal.



Acknowledgements

This work is supported by the National Key R&D Program of China (No. 2018YFB1404303) and ICT Grant CARCHB202017. We also thank the editorial committee and all anonymous reviewers for their insightful comments and suggestions, which improved the content and presentation of this paper.

Author information

Corresponding authors

Correspondence to Zichen Xu or Dan Wu.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest regarding this manuscript.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Wenting Zou is a visiting student at SIAT.

Rights and permissions

Reprints and permissions

About this article


Cite this article

Zou, W., Li, L., Xu, Z. et al. SmartDL: energy-aware decremental learning in a mobile-based federation for geo-spatial system. Neural Comput & Applic 35, 3677–3696 (2023). https://doi.org/10.1007/s00521-021-06378-9

