Abstract:
For highly distributed environments such as edge computing, collaborative learning approaches eschew dependence on a global, shared model in favor of models tailored for each location. Creating tailored models for individual learning contexts reduces the amount of data transfer, while collaboration among peers provides acceptable model performance. Collaboration assumes, however, the availability of knowledge transfer mechanisms, which are not trivial for deep learning models, where knowledge is not easily attributed to precise model slices. We present CLUE – a framework that facilitates knowledge transfer for neural networks. CLUE provides new system support for dynamically extracting significant parameters from a helper node's neural network, and combines this with a multi-model boosting-based approach to improve the predictive performance of the target node. The evaluation of CLUE with different PyTorch and TensorFlow neural network models demonstrates that its knowledge transfer mechanism improves by up to 3.5× how quickly a model adapts to changes, compared to learning in isolation, while affording up to several orders of magnitude reduction in data movement costs compared to federated learning.
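To make the described mechanism concrete, the following minimal PyTorch sketch illustrates the two ideas the abstract names: extracting significant parameters from a helper node's model and combining models in a boosting-style ensemble at the target node. The function names, the magnitude-based significance heuristic, and the fixed ensemble weights are illustrative assumptions for this sketch, not CLUE's actual algorithm.

```python
# A minimal sketch of the knowledge-transfer idea described in the abstract.
# CLUE's actual extraction and boosting algorithms are not specified here;
# the magnitude-based selection and fixed ensemble weights below are
# illustrative assumptions, not the paper's method.
import torch
import torch.nn as nn

def extract_significant_params(helper: nn.Module, keep_ratio: float = 0.1):
    """Return per-tensor masks marking the largest-magnitude parameters.

    Assumption: 'significance' is approximated by absolute weight value.
    """
    masks = {}
    for name, p in helper.named_parameters():
        flat = p.detach().abs().flatten()
        k = max(1, int(keep_ratio * flat.numel()))
        threshold = torch.topk(flat, k).values.min()
        masks[name] = p.detach().abs() >= threshold
    return masks

def transfer(helper: nn.Module, target: nn.Module, masks):
    """Copy only the masked (significant) helper parameters into the target."""
    with torch.no_grad():
        t_params = dict(target.named_parameters())
        for name, hp in helper.named_parameters():
            if name in t_params:
                m = masks[name]
                t_params[name].data[m] = hp.data[m]

def boosted_predict(models, weights, x):
    """Weighted combination of member models (boosting-style ensemble)."""
    return sum(w * m(x) for m, w in zip(models, weights))

# Usage: helper and target share an architecture in this toy example.
helper = nn.Linear(8, 2)
target = nn.Linear(8, 2)
masks = extract_significant_params(helper, keep_ratio=0.2)
transfer(helper, target, masks)
x = torch.randn(4, 8)
y = boosted_predict([target, helper], [0.7, 0.3], x)
```

Note that only the selected parameter subset would need to cross the network, which is where the abstract's claimed reduction in data movement relative to federated learning would come from.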
Published in: IEEE Transactions on Cloud Computing (Volume: 11, Issue: 4, Oct.-Dec. 2023)