Title: Uniform Density in Linguistic Information Derived from Dependency Structures

Authors: Michael Richter (1); Maria Bardají I. Farré (2); Max Kölbl (1); Yuki Kyogoku (1); J. Nathanael Philipp (1); Tariq Yousef (1); Gerhard Heyer (1) and Nikolaus P. Himmelmann (2)

Affiliations: (1) Institute of Computer Science, Natural Language Processing Group, Leipzig University, Germany; (2) Institute of Linguistics, University of Cologne, Germany

Keyword(s): Dependency Structures, Uniform Information Density, Universal Dependencies.

Abstract: This pilot study addresses the question of whether the Uniform Information Density (UID) principle can be confirmed for eight typologically diverse languages. The lexical information of words is derived from dependency structures, both in the sentences preceding the target word and within the sentence in which it occurs. Dependency structures thus serve as a realisation of the extra-sentential contexts from which information is derived in the surprisal model. Only subjects, objects and obliques, i.e. the level directly below the verbal root node, were considered. The UID hypothesis states that in natural language the variance of information, and the information jumps from word to word, should be small, so that processing a linguistic message does not become an insurmountable hurdle. We observed cross-linguistically different information distributions but an almost identical degree of uniformity, which provides evidence for the UID hypothesis and suggests that dependency structures can function as proxies for extra-sentential contexts. However, for the dependency structures chosen as contexts, the information distributions in some languages were not statistically significantly different from the distributions obtained from a random corpus. This might be an effect of the low complexity of our model's dependency structures, so lower hierarchical levels (e.g. phrases) should be considered.
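
The page itself contains no code; as a rough, illustrative sketch of how the surprisal model and UID are commonly operationalised, the Python snippet below computes per-word surprisal from conditional probabilities and two standard uniformity scores: the variance of surprisal within a sentence and the mean squared surprisal jump between adjacent words. All function names and the toy probabilities here are our own assumptions; in the paper, the conditioning contexts are built from dependency relations (subject, object, oblique) in the current and preceding sentences, not from made-up numbers.

import math

def surprisal(p: float) -> float:
    """Surprisal in bits: s(w) = -log2 p(w | context)."""
    return -math.log2(p)

def uid_variance(surprisals: list[float]) -> float:
    """Global UID score: variance of per-word surprisal around
    the sentence mean (lower = more uniform)."""
    mean = sum(surprisals) / len(surprisals)
    return sum((s - mean) ** 2 for s in surprisals) / len(surprisals)

def uid_jumps(surprisals: list[float]) -> float:
    """Local UID score: mean squared surprisal difference between
    adjacent words (penalises information 'jumps')."""
    diffs = [(b - a) ** 2 for a, b in zip(surprisals, surprisals[1:])]
    return sum(diffs) / len(diffs)

# Toy conditional probabilities p(word | context) for a 5-word sentence.
probs = [0.25, 0.20, 0.25, 0.15, 0.22]
s = [surprisal(p) for p in probs]
print("surprisals:", [round(x, 2) for x in s])
print(f"variance:   {uid_variance(s):.3f}")
print(f"jump score: {uid_jumps(s):.3f}")

Under UID, naturally produced sentences should score low on both measures relative to a baseline such as a random corpus, which is the kind of comparison the abstract describes.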

License: CC BY-NC-ND 4.0


Paper citation in several formats:
Richter, M.; Bardají I. Farré, M.; Kölbl, M.; Kyogoku, Y.; Philipp, J.; Yousef, T.; Heyer, G. and Himmelmann, N. (2022). Uniform Density in Linguistic Information Derived from Dependency Structures. In Proceedings of the 14th International Conference on Agents and Artificial Intelligence - Volume 1: NLPinAI; ISBN 978-989-758-547-0; ISSN 2184-433X, SciTePress, pages 496-503. DOI: 10.5220/0010969600003116

@conference{nlpinai22,
author={Michael Richter and Maria {Bardají I. Farré} and Max Kölbl and Yuki Kyogoku and J. Nathanael Philipp and Tariq Yousef and Gerhard Heyer and Nikolaus P. Himmelmann},
title={Uniform Density in Linguistic Information Derived from Dependency Structures},
booktitle={Proceedings of the 14th International Conference on Agents and Artificial Intelligence - Volume 1: NLPinAI},
year={2022},
pages={496-503},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010969600003116},
isbn={978-989-758-547-0},
issn={2184-433X},
}

TY - CONF
JO - Proceedings of the 14th International Conference on Agents and Artificial Intelligence - Volume 1: NLPinAI
TI - Uniform Density in Linguistic Information Derived from Dependency Structures
SN - 978-989-758-547-0
IS - 2184-433X
AU - Richter, M.
AU - Bardají I. Farré, M.
AU - Kölbl, M.
AU - Kyogoku, Y.
AU - Philipp, J.
AU - Yousef, T.
AU - Heyer, G.
AU - Himmelmann, N.
PY - 2022
SP - 496
EP - 503
DO - 10.5220/0010969600003116
PB - SciTePress
ER -