
Neural Nets for Short Movements in Natural Language Processing

  • Conference paper
  • In: Artificial Neural Networks — ICANN 2001 (ICANN 2001)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2130)


Abstract

A neural model is constructed that is able to generate short sentences and their moved (question) forms, based on modern linguistics (X-bar theory) and a cartoon version of the frontal lobes. Future extensions are mentioned in the conclusion.
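The abstract does not describe the model's mechanics, but the "short movement" it refers to is the kind of wh-fronting analysed in X-bar theory, where a question word moves from its base position to the front of the clause, leaving a trace behind. The sketch below is a purely symbolic, illustrative toy of that transformation; the function name, token lists, and trace marker are our own assumptions and have nothing to do with the authors' neural implementation.

    # Minimal, illustrative sketch of wh-fronting ("short movement"); this is
    # NOT the paper's neural model, only a symbolic toy of the target mapping.

    def wh_front(tokens):
        """Move the first wh-word to the front, leaving a trace in its base slot."""
        wh_words = {"what", "who", "where", "when"}
        for i, tok in enumerate(tokens):
            if tok in wh_words:
                # Extract the wh-word, prepend it, and mark its original position.
                return [tok] + tokens[:i] + ["<trace>"] + tokens[i + 1:]
        return tokens  # no wh-word: nothing moves

    if __name__ == "__main__":
        base = ["the", "dog", "chased", "what"]
        print(" ".join(wh_front(base)))  # -> "what the dog chased <trace>"

A network of the kind the paper describes would have to learn such base-form-to-moved-form mappings rather than apply them by rule, which is what the symbolic toy above deliberately glosses over.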




Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Taylor, N., Taylor, J. (2001). Neural Nets for Short Movements in Natural Language Processing. In: Dorffner, G., Bischof, H., Hornik, K. (eds) Artificial Neural Networks — ICANN 2001. ICANN 2001. Lecture Notes in Computer Science, vol 2130. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44668-0_168


  • DOI: https://doi.org/10.1007/3-540-44668-0_168


  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-42486-4

  • Online ISBN: 978-3-540-44668-2

