
EEG-Based Person Authentication with Variational Universal Background Model

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNSC,volume 11928))

Abstract

Silent speech is a convenient and natural modality for person authentication: users can imagine speaking their password instead of typing it. However, EEG signals carry inherent noise and complex variations, making it difficult to capture the relevant information and to model uncertainty. We propose an EEG-based person authentication framework built on variational inference, which learns a simple latent representation of the complex data. A variational universal background model (UBM) is created by pooling the latent models of all users, and a likelihood ratio of the claimed user's model to the background model is used to test whether the claim is valid. Extensive experiments on three datasets show the advantages of the proposed framework.
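The verification rule described in the abstract can be sketched as follows. This is an illustrative toy, not the paper's implementation: the function name, threshold, and the use of plain Gaussians as stand-ins for the learned latent user and background models are all our assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative sketch only: `accept_claim`, the threshold, and the Gaussian
# stand-ins for the learned latent user/background models are our assumptions.
def accept_claim(x, user_model, ubm, threshold=0.0):
    """Accept the identity claim if the log-likelihood ratio of the
    claimed user's model to the universal background model (UBM)
    exceeds the decision threshold."""
    llr = user_model.logpdf(x) - ubm.logpdf(x)
    return llr >= threshold

# Toy stand-ins: a user model centred at (1, 1) and a broad background model.
user = multivariate_normal(mean=[1.0, 1.0], cov=np.eye(2))
background = multivariate_normal(mean=[0.0, 0.0], cov=4 * np.eye(2))

print(accept_claim(np.array([1.0, 1.0]), user, background))     # genuine-like sample
print(accept_claim(np.array([10.0, -10.0]), user, background))  # impostor-like sample
```

The threshold trades off false accepts against false rejects; in practice it would be tuned on held-out data, e.g. at the equal error rate.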



Author information
Correspondence to Dat Tran.


A Appendix


A.1 KL Divergence Derivations

We state the first proposition without proof.

Proposition 3

Let \(f(x)=\mathcal {N}(x;\mu _{f},\varSigma _{f})\) and \(g(x)=\mathcal {N}(x;\mu _{g},\varSigma _{g})\) be two Gaussian distributions in \(\mathbb {R}^{n}.\) The Kullback-Leibler (KL) divergence between f(x) and g(x) is:

$$\begin{aligned} {D_{KL}}(f(x)\Vert g(x))=&-\frac{1}{2}\log \left( \frac{\det \varSigma _{f}}{\det \varSigma _{g}}\right) -\frac{n}{2}\\&+\frac{1}{2}\text {trace}\left( \varSigma _{f}\varSigma _{g}^{-1}\right) +\frac{1}{2}(\mu _{f}-\mu _{g})\varSigma _{g}^{-1}(\mu _{f}-\mu _{g})^{T} \end{aligned}$$
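As a numerical sanity check (not part of the paper), the closed form in Proposition 3 can be coded directly; the helper name `gaussian_kl` is ours. The divergence is zero for identical Gaussians and, in one dimension, reduces to the well-known formula \(\log (\sigma _g/\sigma _f)+\left(\sigma _f^{2}+(\mu _f-\mu _g)^{2}\right)/(2\sigma _g^{2})-1/2\).

```python
import numpy as np

def gaussian_kl(mu_f, cov_f, mu_g, cov_g):
    """Closed-form KL divergence D_KL(N(mu_f, cov_f) || N(mu_g, cov_g)),
    term by term as in Proposition 3."""
    n = mu_f.shape[0]
    cov_g_inv = np.linalg.inv(cov_g)
    diff = mu_f - mu_g
    return 0.5 * (-np.log(np.linalg.det(cov_f) / np.linalg.det(cov_g))
                  - n
                  + np.trace(cov_f @ cov_g_inv)
                  + diff @ cov_g_inv @ diff)

mu = np.array([1.0, -2.0])
cov = np.array([[2.0, 0.3], [0.3, 1.0]])
print(gaussian_kl(mu, cov, mu, cov))  # identical Gaussians: 0 (up to rounding)
# 1-D case N(0,1) vs N(1,4): log 2 + (1 + 1)/8 - 1/2
print(gaussian_kl(np.array([0.0]), np.array([[1.0]]),
                  np.array([1.0]), np.array([[4.0]])))
```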

The following proposition provides the basis for the training objective of our proposed VGM model.

Proposition 4

The variational upper bound of the Kullback-Leibler divergence between a unimodal distribution f(x) and a mixture model \(g(x)=\sum _{k}\alpha _{k}g_{k}(x)\)  is:

$$ {D_{KL}}(f(x)\Vert g(x))\le -\log \sum _{k}\alpha _{k}\exp \left( -D_{k}\right) $$

where \(D_{k}=\int f(x)\log \frac{f(x)}{g_{k}(x)}dx\) is the KL divergence between f(x) and \(g_{k}(x)\), the unimodal component distribution of the mixture g(x).

Proof

We have

$$\begin{aligned} {D_{KL}}(f(x)\Vert g(x))=\int f(x)\log \frac{f(x)}{g(x)}dx&=\int f(x)\log f(x)dx-\int f(x)\log g(x)dx\end{aligned}$$
(18)
$$\begin{aligned}&=L_{f}-L_{g} \end{aligned}$$
(19)

where \(L_{f}=\int f(x)\log f(x)dx\) and \(L_{g}=\int f(x)\log \sum _{k}\alpha _{k}g_{k}(x)dx\), and the index k runs from 1 to K.

We will use a variational method to lower-bound \(L_{g}\). Introduce variational variables \(a_{1},\dots ,a_{K}\) with \(\sum _{k}a_{k}=1\) and \(a_{k}>0\) for \(k=1,\dots ,K\):

$$\begin{aligned} L_{g}&=\int f(x)\log \sum _{k}a_{k}\frac{\alpha _{k}g_{k}(x)}{a_{k}}dx\end{aligned}$$
(20)
$$\begin{aligned}&\ge \int f(x)\sum _{k}a_{k}\log \frac{\alpha _{k}g_{k}(x)}{a_{k}}dx=L'_{g} \end{aligned}$$
(21)

where we used Jensen's inequality in Eq. 21. We want to find the \(a_{k}\) that maximize this lower bound. Since Eq. 21 is concave in each \(a_{k}\), it has a global maximum. Take the partial derivative of \(L'_{g}\) w.r.t. each \(a_{k}\) and set it to zero, denoting \(D_{k}={D_{KL}}\left( f(x)\Vert g_{k}(x)\right) \) for brevity:

$$\begin{aligned} \frac{\partial L'_{g}}{\partial a_{k}}=0&=\int f(x)\left( \log \frac{\alpha _{k}g_{k}(x)}{a_{k}}+a_{k}(-\frac{1}{a_{k}})\right) dx\end{aligned}$$
(22)
$$\begin{aligned} 0&=\int f(x)\left( \log \alpha _{k}+\log g_{k}(x)-\log a_{k}-1\right) dx\end{aligned}$$
(23)
$$\begin{aligned} \log a_{k}&=\int f(x)\left( \log \alpha _{k}+\log g_{k}(x)-\log f(x) +\log f(x)-1\right) dx\end{aligned}$$
(24)
$$\begin{aligned} \log a_{k}&=\log \alpha _{k}-D_{k}+\int f(x)\left( \log f(x)-1\right) dx\end{aligned}$$
(25)
$$\begin{aligned} a_{k}&=\frac{\alpha _{k}\exp \left( -D_{k}\right) }{\exp \left( \int f(x)\left( 1-\log f(x)\right) dx\right) }\end{aligned}$$
(26)
$$\begin{aligned} \sum _{k=1}^{K}a_{k}=1&=\frac{\sum _{k}\alpha _{k}\exp \left( -D_{k}\right) }{\exp \left( \int f(x)\left( 1-\log f(x)\right) dx\right) }\end{aligned}$$
(27)
$$\begin{aligned} \exp (\int f(x)\left( 1-\log f(x)\right) dx)&=\sum _{k}\alpha _{k}\exp \left( -D_{k}\right) \end{aligned}$$
(28)

where we have used \(\int f(x)dx=1\) in Eq. 24.

From Eq. 26 and Eq. 28 we have:

$$\begin{aligned} a_{k}&=\frac{\alpha _{k}\exp \left( -D_{k}\right) }{Z}\end{aligned}$$
(29)
$$\begin{aligned} \text {where}\,\,Z&=\sum _{k=1}^{K}\alpha _{k}\exp \left( -D_{k}\right) \end{aligned}$$
(30)

Plug \(L'_{g}\) in Eq. 21 into Eq. 18:

$$\begin{aligned} {D_{KL}}(f(x)\Vert g(x))&=L_{f}-L_{g}\nonumber \\&\le L_{f}-L'_{g}\nonumber \\&=\int f(x)\log f(x)dx-\int f(x)\sum _{k}a_{k}\log \frac{\alpha _{k}g_{k}(x)}{a_{k}}dx\nonumber \\&=\int f(x)\left[ \log f(x)-\sum _{k}a_{k}\log \frac{\alpha _{k}g_{k}(x)}{a_{k}}\right] dx\nonumber \\&=\int f(x)\left[ \log f(x)-\sum _{k}a_{k}\log \frac{g_{k}(x)Z}{\exp \left( -D_{k}\right) }\right] dx\end{aligned}$$
(31)
$$\begin{aligned}&=-\log Z+\int f(x)\left[ \log f(x)-\sum _{k}a_{k}\log \frac{g_{k}(x)}{\exp \left( -D_{k}\right) }\right] dx\end{aligned}$$
(32)
$$\begin{aligned}&=-\log Z+A \end{aligned}$$
(33)

where we substituted \(a_{k}=\frac{\alpha _{k}\exp \left( -D_{k}\right) }{Z}\) in Eq. 31 and moved \(\log Z\) out of the integral in Eq. 32.

We will show that \(A=0\):

$$\begin{aligned} A&=\int f(x)\left[ \log f(x)-\sum _{k}a_{k}\log \frac{g_{k}(x)}{\exp \left( -D_{k}\right) }\right] dx\\&=\int f(x)\left[ \sum _{k}a_{k}\log f(x)-\sum _{k}a_{k}\left( \log g_{k}(x)+D_{k}\right) \right] dx\\&=\sum _{k}a_{k}\left[ \int f(x)\left( \log f(x)-\log g_{k}(x)\right) dx-D_{k}\right] \\&=\sum _{k}a_{k}\left( D_{k}-D_{k}\right) \\&=0 \end{aligned}$$

where we again used the facts that \(\int f(x)dx=1\) and \(\sum _{k}a_{k}=1\).

We have shown that \({D_{KL}}(f(x)\Vert g(x))\le L_{f}-L'_{g}=-\log Z\).    \(\blacksquare \)
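The bound in Proposition 4 can be checked numerically. The sketch below is ours (the distributions, weights, and sample size are arbitrary choices): it computes the variational upper bound \(-\log \sum _{k}\alpha _{k}\exp (-D_{k})\) for a 1-D Gaussian \(f\) against a two-component Gaussian mixture \(g\), and compares it with a Monte Carlo estimate of the true \(D_{KL}(f\Vert g)\).

```python
import numpy as np
from scipy.stats import norm

# Arbitrary illustrative choice: f is a 1-D Gaussian, g a 2-component mixture.
f = norm(loc=0.0, scale=1.0)
alphas = np.array([0.6, 0.4])
comps = [norm(loc=-1.0, scale=1.5), norm(loc=2.0, scale=0.8)]

def kl_1d(p, q):
    """Closed-form KL divergence between two 1-D Gaussians."""
    mp, sp = p.mean(), p.std()
    mq, sq = q.mean(), q.std()
    return np.log(sq / sp) + (sp**2 + (mp - mq)**2) / (2 * sq**2) - 0.5

# Variational upper bound from Proposition 4: -log sum_k alpha_k exp(-D_k).
D = np.array([kl_1d(f, c) for c in comps])
bound = -np.log(np.sum(alphas * np.exp(-D)))

# Monte Carlo estimate of the true D_KL(f || g).
rng = np.random.default_rng(0)
x = f.rvs(size=200_000, random_state=rng)
log_g = np.logaddexp(np.log(alphas[0]) + comps[0].logpdf(x),
                     np.log(alphas[1]) + comps[1].logpdf(x))
kl_mc = np.mean(f.logpdf(x) - log_g)

print(kl_mc <= bound)  # the bound should hold (up to Monte Carlo noise)
```

The gap between the bound and the true divergence is what the training objective tolerates in exchange for a tractable closed form.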


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Tran, H., Tran, D., Ma, W., Nguyen, P. (2019). EEG-Based Person Authentication with Variational Universal Background Model. In: Liu, J., Huang, X. (eds) Network and System Security. NSS 2019. Lecture Notes in Computer Science(), vol 11928. Springer, Cham. https://doi.org/10.1007/978-3-030-36938-5_25


  • DOI: https://doi.org/10.1007/978-3-030-36938-5_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-36937-8

  • Online ISBN: 978-3-030-36938-5

