
Study on modeling implicit learning based on MAM framework

Artificial Intelligence Review

Abstract

Implicit learning (IL) concerns the fundamental problem of human potential development and has long been an active and difficult research topic. Traditional artificial neural networks can simulate IL, but they have several shortcomings. More recently, morphological neural networks (MNNs) have been used to simulate IL, but the theoretical and practical support for this approach has been weak. The contribution of this study is threefold. First, building on the theory of the unified framework of morphological associative memories (UFMAM), this paper explores in depth the simulation of IL by MNNs. Since both MNN and UFMAM rest on rigorous mathematical morphology, the research stands on a solid theoretical basis. Second, three experiments were designed, and their results were analyzed and discussed within the UFMAM theory. This extends the depth and breadth of IL research, provides new simulation methods and research examples, and establishes an MNN model of IL. Third, the study offers an example of the coordinated development of artificial neural networks, artificial intelligence, cognitive psychology, neuroscience, and brain science. The results show that the MNN-based IL model is superior to traditional IL models in automaticity, comprehension, abstraction, and anti-interference. It can therefore play an important role in future studies of IL and bring new inspiration to the search for the neural mechanisms of IL. MNN and IL are closely interrelated: the former provides new research tools and methods for the latter, while the latter provides psychological and neuroscientific support for the former, giving both a more solid scientific foundation. It is reasonable to believe that computer simulation of IL and other cognitive phenomena will help promote the coordinated development of multiple disciplines.


Acknowledgements

This work was supported by the Henan Province Key R&D Project under Grant 192102310217, the Zhengzhou City Science and Technology Research Project under Grant 153PKJGG153, and the Key Research Project of Zhengzhou University of Industrial Technology under Grant JG-190101.

Author information


Corresponding authors

Correspondence to Naiqin Feng, Xiuqin Geng or Bin Sun.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix

1.1 Proofs of theorems

In the proofs of Theorems 3 and 4, we consider the most complicated case: set Ο = Log and Θ = Exp. For convenience, in the logarithmic function \(y = \log_{a} x\) we assume that a > 1 and x > 1; the other cases admit a similar analysis and proof. We prove each theorem in one domain only, i.e., for either the memory \({\mathbf{V}}_{XY}\) or the memory \({\mathbf{T}}_{XY}\); having proved the result for one memory, the result for the other can be derived analogously by replacing minima with maxima, or vice versa. Under these assumptions, we restate the two theorems as follows:
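To make the setting concrete before stating the theorems, the following minimal Python sketch (ours, for illustration only) builds the two memories under the conventions used in the proof below, namely \(t_{ij} = \mathop \vee \nolimits_{\xi = 1}^{k} \log_{x_{j}^{\xi }} y_{i}^{\xi }\) for \({\mathbf{T}}_{XY}\) and, dually, \(v_{ij} = \mathop \wedge \nolimits_{\xi = 1}^{k} \log_{x_{j}^{\xi }} y_{i}^{\xi }\) for \({\mathbf{V}}_{XY}\) (an assumption consistent with the proof but not restated here), with recall \(({\mathbf{T}}_{XY} \mathop \wedge \limits^{\exp } {\mathbf{x}})_{i} = \mathop \wedge \nolimits_{j} x_{j}^{t_{ij}}\). All pattern entries are kept greater than 1, and the helper names are illustrative:

import numpy as np

# Log/Exp morphological memory pair (illustrative sketch; all entries > 1).
def log_base(base, value):
    return np.log(value) / np.log(base)

def build_memories(X, Y):
    """X: k x n exemplar inputs; Y: k x m associated outputs."""
    L = log_base(X[:, None, :], Y[:, :, None])  # L[xi, i, j] = log_{x_j^xi} y_i^xi
    return L.min(axis=0), L.max(axis=0)         # V_XY, T_XY, each m x n

def recall_V(V, x):
    return (x[None, :] ** V).max(axis=1)  # (V_XY vee-exp x)_i = max_j x_j ** v_ij

def recall_T(T, x):
    return (x[None, :] ** T).min(axis=1)  # (T_XY wedge-exp x)_i = min_j x_j ** t_ij

rng = np.random.default_rng(0)
X = rng.uniform(2.0, 10.0, size=(4, 6))  # k=4 pattern pairs, n=6 input neurons
Y = rng.uniform(2.0, 10.0, size=(4, 3))  # m=3 output neurons
V, T = build_memories(X, Y)
print(np.all(recall_V(V, X[0]) <= Y[0] + 1e-9))  # V recalls from below: True
print(np.all(recall_T(T, X[0]) >= Y[0] - 1e-9))  # T recalls from above: True

Theorems 3′ and 4′ below characterize exactly when these one-sided recall bounds collapse to equality for a distorted input.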

Theorem 3′

Let \({\tilde{\mathbf{x}}}^{l}\) denote a distorted version of \({\mathbf{x}}^{l}\). Then \({\mathbf{V}}_{XY} \mathop \vee \limits^{\exp } {\tilde{\mathbf{x}}}^{l} = {\mathbf{y}}^{l}\) if and only if

$$ \widetilde{x}_{j}^{l} \le x_{j}^{l} \vee \mathop \wedge \limits_{i = 1}^{m} \left( {\mathop \vee \limits_{\xi \ne l} \left( {x_{j}^{\xi } } \right)^{{\log_{{y_{i}^{\xi } }} y_{i}^{l} }} } \right) \quad \forall j = 1, \ldots ,n $$
(27)

and for each row index i ∈ {1, …, m}, there is a column index ji ∈ {1, …, n} such that

$$\widetilde{x}_{{j_{i} }}^{l} = x_{{j_{i} }}^{l} \vee \left( {\mathop \vee \limits_{\xi \ne l} \mathop {\left( {x_{{j_{i} }}^{\xi } } \right)}\nolimits^{{\log_{{y_{i}^{\xi } }} y_{i}^{l} }} } \right) \, $$
(28)

Theorem 4′

Let \({\tilde{\mathbf{x}}}^{l}\) denote a distorted version of \({\mathbf{x}}^{l}\). Then \({\mathbf{T}}_{XY} \mathop \wedge \limits^{\exp } {\tilde{\mathbf{x}}}^{l} = {\mathbf{y}}^{l}\) if and only if

$$ \widetilde{x}_{j}^{l} \ge x_{j}^{l} \wedge \mathop \vee \limits_{i = 1}^{m} \left( {\mathop \wedge \limits_{\xi \ne l} \left( {x_{j}^{\xi } } \right)^{{\log_{{y_{i}^{\xi } }} y_{i}^{l} }} } \right) \quad \forall j = 1, \ldots ,n $$
(29)

and for each row index i ∈ {1, …, m}, there is a column index ji ∈ {1, …, n} such that

$$\widetilde{x}_{{j_{i} }}^{l} = x_{{j_{i} }}^{l} \wedge \left( {\mathop \wedge \limits_{\xi \ne l} \mathop {\left( {x_{{j_{i} }}^{\xi } } \right)}\nolimits^{{\log_{{y_{i}^{\xi } }} y_{i}^{l} }} } \right) \, $$
(30)

Proof of Theorem 4′

Before proving Theorem 4′, let us first establish a useful identity relating exponential and logarithmic operations:

$$ a^{{\log_{b} c}} = c^{{\log_{b} a}} $$
(31)

Let

$$ a^{{\log_{b} c}} = A \Leftrightarrow \log_{b} c = \log_{a} A = \frac{{\log_{c} A}}{{\log_{c} a}} \Leftrightarrow \log_{c} A = \log_{c} a \cdot \log_{b} c = \log_{b} a $$
(32)

Namely \(A = c^{{\log_{b} a}}\).
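As a quick numerical sanity check of identity (31), take a = 3, b = 2 and c = 7:

$$ 3^{\log_{2} 7} = e^{\frac{\ln 3 \cdot \ln 7}{\ln 2}} = 7^{\log_{2} 3} \approx 21.85 $$

The middle expression is symmetric in a and c, which is precisely why the identity holds.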

  (a)

    Suppose that \({\tilde{\mathbf{x}}}^{l}\) denotes a distorted version of \({\mathbf{x}}^{l}\) and that for l = 1, …, k, TXY \(\mathop \wedge \limits^{\exp }\)\( \tilde{x}^{l} \) = \( {\mathbf{y}}^{l} \). Then

    $$ y_{i}^{l} = \left( {{\mathbf{T}}_{{{\text{XY}}}} \mathop \wedge \limits^{\exp } \widetilde{{\varvec{x}}}^{l} } \right)_{i} = \mathop \wedge \limits_{r = 1}^{n} \left( {\widetilde{x}_{r}^{l} } \right)^{{t_{ir} }} \le \left( {\widetilde{x}_{j}^{l} } \right)^{{t_{ij} }} \quad \forall i = 1, \ldots ,m, \; \forall j = 1, \ldots ,n $$
    (33)

    Namely

    $$ \begin{aligned} \log_{{\widetilde{x}_{j}^{l} }} y_{i}^{l} \le & \log_{{\widetilde{x}_{j}^{l} }} \left( {\widetilde{x}_{j}^{l} } \right)^{{t_{ij} }} = t_{ij} \quad \forall i = 1, \ldots ,m, \, \forall j = 1, \ldots ,n \\ \Leftrightarrow & \frac{{t_{ij} }}{{\log_{{\widetilde{x}_{j}^{l} }} y_{i}^{l} }} \ge 1 \quad \forall i = 1, \ldots ,m, \, \forall j = 1, \ldots ,n \\ \Leftrightarrow & t_{ij} \cdot \log_{{y_{i}^{l} }} \widetilde{x}_{j}^{l} \ge 1 \Leftrightarrow \left( {\widetilde{x}_{j}^{l} } \right)^{{t_{ij} }} \ge y_{i}^{l} \Leftrightarrow \widetilde{x}_{j}^{l} \ge \left( {y_{i}^{l} } \right)^{{1/t_{ij} }} \quad \forall i = 1, \ldots ,m, \, \forall j = 1, \ldots ,n \\ \Leftrightarrow & \widetilde{x}_{j}^{l} \ge \mathop \vee \limits_{i = 1}^{m} \left( {y_{i}^{l} } \right)^{{\frac{1}{{t_{ij} }}}} = \mathop \vee \limits_{i = 1}^{m} \left( {y_{i}^{l} } \right)^{{\frac{1}{{\mathop \vee \limits_{\xi = 1}^{k} \log_{{x_{j}^{\xi } }} y_{i}^{\xi } }}}} = \mathop \vee \limits_{i = 1}^{m} \left( {y_{i}^{l} } \right)^{{\mathop \wedge \limits_{\xi = 1}^{k} \log_{{y_{i}^{\xi } }} x_{j}^{\xi } }} \quad \forall j = 1, \ldots ,n \\ \Leftrightarrow & \widetilde{x}_{j}^{l} \ge \mathop \vee \limits_{i = 1}^{m} \left( {y_{i}^{l} } \right)^{{\left( {\mathop \wedge \limits_{\xi \ne l} \log_{{y_{i}^{\xi } }} x_{j}^{\xi } } \right) \wedge \log_{{y_{i}^{l} }} x_{j}^{l} }} \quad \forall j = 1, \ldots ,n \\ \Leftrightarrow & \widetilde{x}_{j}^{l} \ge \mathop \vee \limits_{i = 1}^{m} \left[ {\left( {y_{i}^{l} } \right)^{{\mathop \wedge \limits_{\xi \ne l} \log_{{y_{i}^{\xi } }} x_{j}^{\xi } }} \wedge x_{j}^{l} } \right] = x_{j}^{l} \wedge \mathop \vee \limits_{i = 1}^{m} \left[ {\mathop \wedge \limits_{\xi \ne l} \left( {y_{i}^{l} } \right)^{{\log_{{y_{i}^{\xi } }} x_{j}^{\xi } }} } \right] \quad \forall j = 1, \ldots ,n \\ \end{aligned} $$

    According to Eq. (5) we have

    $$ \widetilde{x}_{j}^{l} \ge x_{j}^{l} \wedge \mathop \vee \limits_{i = 1}^{m} [\mathop \wedge \limits_{\xi \ne l}^{{}} (x_{j}^{\xi } {)}^{{\log_{{y_{i}^{\xi } }} y_{i}^{l} }} ] \, \forall j = 1, \ldots ,n $$
    (34)

    This shows that inequality (29) is satisfied. It also follows that

    $$ \widetilde{x}_{j}^{l} \ge x_{j}^{l} \wedge [\mathop \wedge \limits_{\xi \ne l}^{{}} (x_{j}^{\xi } {)}^{{\log_{{y_{i}^{\xi } }} y_{i}^{l} }} ] \, \forall j = 1, \ldots ,n,\forall i = 1, \ldots ,m $$
    (35)

    Suppose that, for some row, the set of inequalities given by (35) contains no equality; i.e., assume that there exists a row index i ∈ {1, …, m} such that

    $$ \widetilde{x}_{j}^{l} > x_{j}^{l} \wedge [\mathop \wedge \limits_{\xi \ne l}^{{}} (x_{j}^{\xi } {)}^{{\log_{{y_{i}^{\xi } }} y_{i}^{l} }} ] = x_{j}^{l} \wedge [\mathop \wedge \limits_{\xi \ne l}^{{}} (y_{i}^{l} {)}^{{\log_{{y_{i}^{\xi } }} x_{j}^{\xi } }} ] \, \forall j = 1, \ldots ,n $$
    (36)

    Then

    $$ \begin{aligned} ({\mathbf{T}}_{{{\text{XY}}}} \mathop \wedge \limits^{\exp } \widetilde{{\varvec{x}}}^{l} )_{i} = & \mathop \wedge \limits_{j = 1}^{n} (\widetilde{x}_{j}^{l} )^{{t_{ij} }} > \mathop \wedge \limits_{j = 1}^{n} (x_{j}^{l} \wedge [\mathop \wedge \limits_{\xi \ne l}^{{}} (y_{i}^{l} {)}^{{\log_{{y_{i}^{\xi } }} x_{j}^{\xi } }} ])^{{t_{ij} }} = \mathop \wedge \limits_{j = 1}^{n} [\mathop \wedge \limits_{\xi = 1}^{k} (y_{i}^{l} {)}^{{\log_{{y_{i}^{\xi } }} x_{j}^{\xi } }} ]^{{t_{ij} }} \\ = & \mathop \wedge \limits_{j = 1}^{n} [(y_{i}^{l} {)}^{{\mathop \wedge \limits_{\xi = 1}^{k} \log_{{y_{i}^{\xi } }} x_{j}^{\xi } }} ]^{{t_{ij} }} = \mathop \wedge \limits_{j = 1}^{n} [(y_{i}^{l} {)}^{{\frac{1}{{\mathop \vee \limits_{\xi = 1}^{k} \log_{{x_{j}^{\xi } }} y_{i}^{\xi } }}}} ]^{{t_{ij} }} \\ = & \mathop \wedge \limits_{j = 1}^{n} [(y_{i}^{l} {)}^{{\frac{1}{{t_{ij} }}}} ]^{{t_{ij} }} = y_{i}^{l} \\ \end{aligned} $$
    (37)

    Therefore \(\left( {{\mathbf{T}}_{{{\text{XY}}}} \mathop \wedge \limits^{\exp } \widetilde{{\varvec{x}}}^{l} } \right)_{i} > y_{i}^{l}\), which contradicts the hypothesis that \({\mathbf{T}}_{{{\text{XY}}}} \mathop \wedge \limits^{\exp } \widetilde{{\varvec{x}}}^{l} = {\varvec{y}}^{l}\). It follows that for each row index i there must exist a column index j_i satisfying Eq. (30).

  (b)

    Suppose that

    $$ \widetilde{x}_{j}^{l} \ge x_{j}^{l} \wedge \mathop \vee \limits_{i = 1}^{m} [\mathop \wedge \limits_{\xi \ne l}^{{}} (x_{j}^{\xi } {)}^{{\log_{{y_{i}^{\xi } }} y_{i}^{l} }} ] \, \forall j = 1, \ldots ,n $$
    (38)

According to the proof in part (a), the inequality is true if and only if

$$ \widetilde{x}_{j}^{l} \ge (y_{i}^{l} {)}^{{\frac{1}{{{\text{t}}_{ij} }}}} \, \forall i{ = 1,} \ldots ,m, \, \forall j = 1, \ldots ,n $$
(39)

or, equivalently, if and only if

$$ \begin{aligned} (\widetilde{x}_{j}^{l} )^{{t_{ij} }} \ge & y_{i}^{l} \, \forall i{ = 1,} \ldots ,m, \, \forall j = 1, \ldots ,n \\ \Leftrightarrow & \mathop \wedge \limits_{j = 1}^{n} (\widetilde{x}_{j}^{l} )^{{t_{ij} }} \ge y_{i}^{l} \, \forall i{ = 1,} \ldots ,m \\ \Leftrightarrow & ({\mathbf{T}}_{{{\text{XY}}}} \mathop \wedge \limits^{\exp } \widetilde{{\varvec{x}}}_{{}}^{l} )_{i} \ge y_{i}^{l} \, \forall i{ = 1,} \ldots ,m \\ \end{aligned} $$
(40)

which implies that \({\mathbf{T}}_{{{\text{XY}}}} \mathop \wedge \limits^{\exp } \widetilde{{\varvec{x}}}^{l} \ge {\varvec{y}}^{l}\) for all l = 1, …, k. If we can also show that \({\mathbf{T}}_{{{\text{XY}}}} \mathop \wedge \limits^{\exp } \widetilde{{\varvec{x}}}^{l} \le {\varvec{y}}^{l}\) for all l = 1, …, k, then it follows that \({\mathbf{T}}_{{{\text{XY}}}} \mathop \wedge \limits^{\exp } \widetilde{{\varvec{x}}}^{l} = {\varvec{y}}^{l}\) for all l = 1, …, k.

Let l ∈ {1, …, k} and i ∈ {1, …, m} be arbitrarily chosen. Then

$$ \begin{aligned} ({\mathbf{T}}_{{{\text{XY}}}} \mathop \wedge \limits^{\exp } \widetilde{{\varvec{x}}}_{{}}^{l} )_{i} = & \mathop \wedge \limits_{j = 1}^{n} \left( {\widetilde{x}_{j}^{l} } \right)^{{t_{ij} }} \le \left( {\widetilde{x}_{{j_{i} }}^{l} } \right)^{{t_{{ij_{i} }} }} { = }\left[ {x_{{j_{i} }}^{l} \wedge \left( {\mathop \wedge \limits_{\xi \ne l} \mathop {\left( {x_{{j_{i} }}^{\xi } } \right)}\nolimits^{{\log_{{y_{i}^{\xi } }} y_{i}^{l} }} } \right)} \right]^{{t_{{ij_{i} }} }} \\ = & \left( {x_{{j_{i} }}^{l} \wedge \left[ {\mathop \wedge \limits_{\xi \ne l}^{{}} \left( {y_{i}^{l} } \right)^{{\log_{{y_{i}^{\xi } }} x_{{j_{i} }}^{\xi } }} } \right]} \right)^{{t_{{ij_{i} }} }} = \left[ {\mathop \wedge \limits_{\xi = 1}^{k} \left( {y_{i}^{l} } \right)^{{\log_{{y_{i}^{\xi } }} x_{{j_{i} }}^{\xi } }} } \right]^{{t_{{ij_{i} }} }} \\ = & \left[ {\left( {y_{i}^{l} } \right)^{{\mathop \wedge \limits_{\xi = 1}^{k} \log_{{y_{i}^{\xi } }} x_{{j_{i} }}^{\xi } }} } \right]^{{t_{{ij_{i} }} }} { = }\left[ {\left( {y_{i}^{l} } \right)^{{^{{\frac{1}{{\mathop \vee \limits_{\xi = 1}^{k} \log_{{x_{{j_{i} }}^{\xi } }} y_{i}^{\xi } }}}} }} } \right]^{{t_{{ij_{i} }} }} \\ = & \left[ {\left( {y_{i}^{l} } \right)^{{^{{\frac{1}{{t_{{ij_{i} }} }}}} }} } \right]^{{t_{{ij_{i} }} }} = y_{i}^{l} \\ \end{aligned} $$
(41)

This shows that TXY \(\mathop \wedge \limits^{\exp }\)\( \tilde{x}^{l} \) ≤ \({\varvec{y}}^l\), \(\forall\) l = 1, …, k.

Theorem 4′ is thus proved, and Theorem 3′ can be proved in a similar manner. When Ο = − and Θ = +, or Ο = • and Θ = /, readers can prove the two theorems analogously. These two theorems provide bounds on the amount of distortion of the exemplar patterns \({\mathbf{x}}^{l}\) for which perfect recall can still be guaranteed.
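To make these bounds concrete, the following Python sketch (ours, for illustration; it reuses the construction \(t_{ij} = \mathop \vee \nolimits_{\xi = 1}^{k} \log_{x_{j}^{\xi }} y_{i}^{\xi }\) from the proof, and the helper names are illustrative) independently evaluates both sides of the Theorem 4′ equivalence, i.e., whether conditions (29) and (30) hold for a test input and whether \({\mathbf{T}}_{XY} \mathop \wedge \limits^{\exp } {\tilde{\mathbf{x}}}^{l} = {\mathbf{y}}^{l}\); each comparison should print True:

import numpy as np

def log_base(base, value):
    return np.log(value) / np.log(base)

def build_T(X, Y):
    # t_ij = max over xi of log_{x_j^xi}(y_i^xi)
    return log_base(X[:, None, :], Y[:, :, None]).max(axis=0)

def recall_T(T, x):
    # (T_XY wedge-exp x)_i = min_j x_j ** t_ij
    return (x[None, :] ** T).min(axis=1)

def theorem4_conditions(X, Y, l, xt, tol=1e-9):
    k, n = X.shape
    m = Y.shape[1]
    others = [xi for xi in range(k) if xi != l]
    # A[i, j] = min over xi != l of (x_j^xi) ** log_{y_i^xi}(y_i^l)
    A = np.array([[min(X[xi, j] ** log_base(Y[xi, i], Y[l, i]) for xi in others)
                   for j in range(n)] for i in range(m)])
    cond29 = np.all(xt >= np.minimum(X[l], A.max(axis=0)) - tol)      # Eq. (29)
    cond30 = all(np.any(np.abs(xt - np.minimum(X[l], A[i])) <= tol)   # Eq. (30)
                 for i in range(m))
    return bool(cond29 and cond30)

rng = np.random.default_rng(1)
X = rng.uniform(2.0, 10.0, size=(3, 5))  # k=3 exemplars, n=5 input neurons
Y = rng.uniform(2.0, 10.0, size=(3, 4))  # m=4 output neurons
T = build_T(X, Y)
for l in range(3):
    xt = X[l]  # the undistorted exemplar as a test input
    print(theorem4_conditions(X, Y, l, xt) == bool(np.allclose(recall_T(T, xt), Y[l])))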


About this article


Cite this article

Feng, N., Geng, X. & Sun, B. Study on modeling implicit learning based on MAM framework. Artif Intell Rev 54, 4799–4825 (2021). https://doi.org/10.1007/s10462-021-10019-x
