Modular Multilayer Neural Networks Integrate Multisensory Information Near-optimally


Abstract:

The tasks we regularly perform often require the brain to integrate information from multiple sensory inputs. Experiments on human subjects have shown that the human brain combines multiple modalities in an optimal way predicted by Bayesian inference. However, few studies have examined whether deep neural networks can achieve the same feat. We therefore explore the capability of a modular multilayer neural network to integrate multiple sources of information. We designed a task in which two cameras, one on the left and one on the right, monitor a rotating chair. Our network comprises two modules, each processing one camera input, connected by an integrator layer. We find that the network combines the information from the two cameras in a nearly Bayes-optimal manner, even when it is trained only with single-camera inputs. Future work could study the capacity to integrate information from more than two inputs.
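
As context for the "nearly Bayes-optimal" claim, the sketch below pairs a hypothetical two-module network with the standard reliability-weighted (inverse-variance) cue-combination rule that such comparisons typically use as the benchmark. The layer sizes, activations, single-angle output, and all identifiers (ModularIntegrationNet, bayes_optimal_combination) are illustrative assumptions, not taken from the paper, which specifies only two camera-specific modules joined by an integrator layer.

```python
# Minimal sketch under assumed details: two camera-specific modules feeding a
# shared integrator layer, plus the Bayes-optimal combination rule used as the
# benchmark for "near-optimal" multisensory integration.
import torch
import torch.nn as nn

class ModularIntegrationNet(nn.Module):
    def __init__(self, input_dim=64, hidden_dim=32, integrator_dim=32):
        super().__init__()
        # One module per camera input (left / right view of the rotating chair).
        self.left_module = nn.Sequential(
            nn.Linear(input_dim, hidden_dim), nn.ReLU())
        self.right_module = nn.Sequential(
            nn.Linear(input_dim, hidden_dim), nn.ReLU())
        # Integrator layer combines the two module outputs into one estimate
        # (here a single scalar, e.g. the chair's rotation angle).
        self.integrator = nn.Sequential(
            nn.Linear(2 * hidden_dim, integrator_dim), nn.ReLU(),
            nn.Linear(integrator_dim, 1))

    def forward(self, left_input, right_input):
        h = torch.cat([self.left_module(left_input),
                       self.right_module(right_input)], dim=-1)
        return self.integrator(h)

def bayes_optimal_combination(est1, var1, est2, var2):
    """Reliability-weighted cue combination of two Gaussian estimates:
    each cue is weighted by its inverse variance, and the fused estimate
    has lower variance than either cue alone."""
    w1 = (1.0 / var1) / (1.0 / var1 + 1.0 / var2)
    w2 = 1.0 - w1
    fused_est = w1 * est1 + w2 * est2
    fused_var = (var1 * var2) / (var1 + var2)
    return fused_est, fused_var
```

Comparing the network's two-camera error against the fused variance returned by bayes_optimal_combination (estimated from its single-camera errors) is one way to quantify how close the learned integration comes to the Bayesian benchmark.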
Date of Conference: 14-19 July 2019
Date Added to IEEE Xplore: 30 September 2019
Conference Location: Budapest, Hungary
