
EdgeNILM: Towards NILM on Edge devices

Published: 18 November 2020
DOI: 10.1145/3408308.3427977

Abstract

Non-intrusive load monitoring (NILM), or energy disaggregation, refers to the task of estimating appliance-level power consumption given the aggregate power consumption readings. Recent state-of-the-art neural-network-based methods are computation and memory intensive, and thus not suitable to run on "edge devices". Recent research has proposed various methods to compress neural networks without significantly impacting accuracy. In this work, we study different neural network compression schemes and their efficacy on the state-of-the-art neural network NILM method. We additionally propose a multi-task learning-based architecture to compress models further. We perform an extensive evaluation of these techniques on two publicly available datasets and find that we can reduce the memory and compute footprint by a factor of up to 100 without significantly impacting predictive performance.
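To make the abstract's setting concrete, the sketch below pairs a seq2point-style CNN for single-appliance disaggregation with two generic compression techniques of the kind studied in this line of work: magnitude pruning and post-training dynamic quantization. It is a minimal PyTorch illustration; the class name, layer sizes, 99-sample input window, and 80% pruning ratio are assumptions made for the example, not the authors' architecture, multi-task model, or exact compression pipeline.

```python
# Minimal sketch (assumed layer sizes and window length; illustrative only).
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

class Seq2PointNILM(nn.Module):
    """Seq2point-style CNN: maps a window of aggregate power readings to the
    target appliance's power at the window midpoint."""
    def __init__(self, window_len: int = 99):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 30, kernel_size=10), nn.ReLU(),
            nn.Conv1d(30, 40, kernel_size=8), nn.ReLU(),
            nn.Conv1d(40, 50, kernel_size=6), nn.ReLU(),
        )
        conv_out = window_len - 9 - 7 - 5            # length left after three valid convolutions
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(50 * conv_out, 1024), nn.ReLU(),
            nn.Linear(1024, 1),                      # predicted appliance power at the window midpoint
        )

    def forward(self, x):                            # x: (batch, 1, window_len) aggregate power
        return self.regressor(self.features(x))

model = Seq2PointNILM()

# 1) Magnitude pruning: zero the smallest 80% of weights in every conv/linear layer.
#    (Unstructured zeros only reduce memory if stored sparsely; structured/filter
#    pruning is what actually shrinks the dense tensors.)
for module in model.modules():
    if isinstance(module, (nn.Conv1d, nn.Linear)):
        prune.l1_unstructured(module, name="weight", amount=0.8)
        prune.remove(module, "weight")               # bake the pruning mask into the weight tensor

# 2) Post-training dynamic quantization: store linear-layer weights as int8.
compressed = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

window = torch.randn(1, 1, 99)                       # one synthetic aggregate-power window
print(compressed(window).shape)                      # torch.Size([1, 1])
```

In seq2point-style models the flattened dense layer holds most of the parameters, so pruning and quantizing it is where the bulk of the savings come from; stacking several such techniques, and sharing layers across appliances via multi-task learning as the abstract proposes, is how reductions on the order of 100x in memory and compute become plausible.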



    Published In

    BuildSys '20: Proceedings of the 7th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation
    November 2020
    361 pages
    ISBN:9781450380614
    DOI:10.1145/3408308

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Edge computing
    2. Neural networks
    3. Non-Intrusive Load Monitoring

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    BuildSys '20

    Acceptance Rates

    BuildSys '20 paper acceptance rate: 38 of 139 submissions, 27%
    Overall acceptance rate: 148 of 500 submissions, 30%


    Cited By

    • (2025) Efficient Energy Disaggregation via Residual Learning-Based Depthwise Separable Convolutions and Segmented Inference. IEEE Transactions on Industrial Informatics 21, 3, 2224-2233. DOI: 10.1109/TII.2024.3495771. Online publication date: Mar-2025.
    • (2024) Knowledge Distillation for Scalable Nonintrusive Load Monitoring. IEEE Transactions on Industrial Informatics 20, 3, 4710-4721. DOI: 10.1109/TII.2023.3328436. Online publication date: Mar-2024.
    • (2024) OPT-NILM: An Iterative Prior-to-Full-Training Pruning Approach for Cost-Effective User Side Energy Disaggregation. IEEE Transactions on Consumer Electronics 70, 1, 4435-4446. DOI: 10.1109/TCE.2023.3324493. Online publication date: Feb-2024.
    • (2024) Towards edge-computed NILM: Insights from a Mediterranean Use Case. 2024 3rd International Conference on Energy Transition in the Mediterranean Area (SyNERGY MED), 1-5. DOI: 10.1109/SyNERGYMED62435.2024.10799297. Online publication date: 21-Oct-2024.
    • (2024) A Pre-Training Pruning Strategy for Enabling Lightweight Non-Intrusive Load Monitoring On Edge Devices. 2024 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW), 249-253. DOI: 10.1109/ICASSPW62465.2024.10626463. Online publication date: 14-Apr-2024.
    • (2024) Advancing Sustainable IoT Appliance Load Monitoring Through Edge-Enabled Federated Transfer Learning. 2024 International Conference on Green Energy, Computing and Sustainable Technology (GECOST), 386-391. DOI: 10.1109/GECOST60902.2024.10474899. Online publication date: 17-Jan-2024.
    • (2024) DP²-NILM: A distributed and privacy-preserving framework for non-intrusive load monitoring. Renewable and Sustainable Energy Reviews 191, 114091. DOI: 10.1016/j.rser.2023.114091. Online publication date: Mar-2024.
    • (2024) A review of current methods and challenges of advanced deep learning-based non-intrusive load monitoring (NILM) in residential context. Energy and Buildings 305, 113890. DOI: 10.1016/j.enbuild.2024.113890. Online publication date: Feb-2024.
    • (2023) A New NILM System Based on the SFRA Technique and Machine Learning. Sensors 23, 11, 5226. DOI: 10.3390/s23115226. Online publication date: 31-May-2023.
    • (2023) Variational Regression for Multi-Target Energy Disaggregation. Sensors 23, 4, 2051. DOI: 10.3390/s23042051. Online publication date: 11-Feb-2023.
