Swap Kernel Regression

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11727)

Abstract

Recent developments in artificial intelligence have increased the demand for high-performance computation devices. An edge device is highly restricted not only in its computational power but also in its memory capacity. This study proposes a method that enables both inference and learning on an edge device. The proposed method is a kernel machine that works in restricted environments by collaborating with its secondary storage system. Kernel parameters that are not essential for calculating the output values for upcoming inputs are moved to secondary storage to free space in main memory; essential kernel parameters held in secondary storage are loaded back into main memory when required. With this strategy, the system realizes recognition/regression tasks without reducing its generalization capability.
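
To make this strategy concrete, the following minimal Python sketch illustrates the idea. It is not the paper's algorithm: the class name SwapKernelRegressor, the Gaussian kernels with a Nadaraya-Watson output, the temporary directory standing in for secondary storage, and the farthest-from-input eviction rule are all assumptions made for illustration; only the swap-out/swap-in pattern between main memory and secondary storage follows the abstract.

```python
import math
import os
import pickle
import tempfile


class SwapKernelRegressor:
    """Minimal sketch (not the authors' implementation) of a kernel
    machine with a bounded in-memory kernel budget: kernels judged
    unnecessary for upcoming inputs are spilled to secondary storage
    and reloaded on demand. Names and the eviction policy are assumptions."""

    def __init__(self, max_in_memory=64, bandwidth=1.0):
        self.max_in_memory = max_in_memory  # main-memory kernel budget
        self.h = bandwidth                  # Gaussian kernel width
        self.memory = {}                    # kid -> (center, weight), resident
        self.disk_index = {}                # kid -> center, for swapped-out kernels
        self.swap_dir = tempfile.mkdtemp()  # stands in for secondary storage
        self.next_id = 0

    def _kernel(self, x, c):
        return math.exp(-((x - c) ** 2) / (2.0 * self.h ** 2))

    def _path(self, kid):
        return os.path.join(self.swap_dir, f"{kid}.pkl")

    def _swap_out(self, x):
        # Evict the resident kernel farthest from the current input,
        # an assumed proxy for "not essential for upcoming inputs".
        kid = max(self.memory, key=lambda k: abs(self.memory[k][0] - x))
        center, weight = self.memory.pop(kid)
        with open(self._path(kid), "wb") as f:
            pickle.dump((center, weight), f)
        self.disk_index[kid] = center  # keep only a tiny index in RAM

    def _swap_in(self, kid, x):
        # Reload a kernel from secondary storage, making room first.
        if len(self.memory) >= self.max_in_memory:
            self._swap_out(x)
        with open(self._path(kid), "rb") as f:
            self.memory[kid] = pickle.load(f)
        os.remove(self._path(kid))
        del self.disk_index[kid]

    def learn(self, x, y):
        # Naive incremental learning: one new kernel per sample.
        if len(self.memory) >= self.max_in_memory:
            self._swap_out(x)
        self.memory[self.next_id] = (x, y)
        self.next_id += 1

    def predict(self, x):
        # Reload any swapped-out kernels whose centers lie near x ...
        for kid in [k for k, c in self.disk_index.items() if abs(x - c) < 3 * self.h]:
            self._swap_in(kid, x)
        # ... then form a Nadaraya-Watson estimate over resident kernels.
        num = den = 0.0
        for c, w in self.memory.values():
            k = self._kernel(x, c)
            num += k * w
            den += k
        return num / den if den > 0.0 else 0.0


# Tiny usage example: learn a noisy-free line y = 2x, then query.
model = SwapKernelRegressor(max_in_memory=8, bandwidth=0.5)
for i in range(100):
    model.learn(i * 0.1, 2.0 * (i * 0.1))
print(model.predict(3.0))  # roughly 6.0; distant kernels live on disk
```

In the paper's setting the swapped kernel parameters would be far larger than the lightweight center index kept in main memory, which is what makes spilling them to secondary storage worthwhile; the far-from-query eviction rule above is only a stand-in for a policy driven by the input distribution.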

Part of this research was supported by the Chubu University Grant A.


Notes

  1.

    The original LGRNN algorithm has four learning options (Yamauchi 2014), one of which is the projection operation. However, this operation is not useful for the swap kernel regression; hence, we only use the replacement and ignore options here.

  2.

    Even in such situations, if the distribution area of the related instances is wide enough, the number of kernels in the same cluster becomes larger than \(|S_t^1|\), and hence the swap occurs repeatedly (see Algorithm 2 and the sketch following these notes).

  3.

    Although the VGG16 model supports the processing of full-color images, we provide the re-scaled gray-level images as the input.

  4.

    Note that the recognition result for each new instance was tested before the swap kernel regression learned it incrementally. Thus, if the swap kernel regression fails to recognize a new instance, the cumulative number of mistakes is incremented.
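
The repeated-swap condition in Note 2 can be illustrated with a toy loop. This is a hedged sketch, not the paper's Algorithm 2: serve_cluster and its eviction rule are hypothetical, and in_memory_capacity stands in for \(|S_t^1|\), the number of kernels that fit in main memory.

```python
def serve_cluster(cluster_kernels, in_memory_capacity):
    """Toy illustration of Note 2 (hypothetical, not Algorithm 2):
    if one cluster holds more kernels than the in-memory set can
    (|S_t^1|), touching them all forces repeated swap-in/swap-out."""
    resident = set()
    swaps = 0
    for kid in cluster_kernels:      # kernels needed for inputs in this cluster
        if kid not in resident:
            if len(resident) >= in_memory_capacity:
                resident.pop()       # evict some resident kernel (swap-out)
                swaps += 1
            resident.add(kid)        # load the needed kernel (swap-in)
    return swaps


# A cluster of 10 kernels against a capacity of 4: prints 6, and the
# same evictions recur on every subsequent pass over the cluster.
print(serve_cluster(range(10), in_memory_capacity=4))
```

Because the cluster never fits in main memory at once, every pass over its inputs triggers evictions, which is the sense in which the swap occurs repeatedly.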


Author information


Correspondence to Koichiro Yamauchi.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Yamamoto, M., Yamauchi, K. (2019). Swap Kernel Regression. In: Tetko, I., Kůrková, V., Karpov, P., Theis, F. (eds) Artificial Neural Networks and Machine Learning – ICANN 2019: Theoretical Neural Computation. ICANN 2019. Lecture Notes in Computer Science, vol 11727. Springer, Cham. https://doi.org/10.1007/978-3-030-30487-4_18


  • DOI: https://doi.org/10.1007/978-3-030-30487-4_18


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30486-7

  • Online ISBN: 978-3-030-30487-4

  • eBook Packages: Computer Science, Computer Science (R0)
