Fast inference services for alternative deep learning structures
Abstract
Information
Published In
- General Chairs: Songqing Chen, Ryokichi Onishi
- Program Chairs: Ganesh Ananthanarayanan, Qun Li
In-Cooperation
- IEEE-CS\DATC: IEEE Computer Society
Publisher
Association for Computing Machinery
New York, NY, United States
Qualifiers
- Poster
Article Metrics
- Total Citations: 0
- Total Downloads: 160
- Downloads (last 12 months): 4
- Downloads (last 6 weeks): 1