DOI: 10.1145/3663976.3664029

Judicial Text Relation Extraction Based on Prompt Tuning

Published: 27 June 2024

Abstract

With the accelerating digital transformation of the judicial field, automated processing and analysis of voluminous legal documents has become increasingly imperative. Relation extraction, a pivotal step in understanding these documents, underpins automated case analysis, discovery of legal facts, and judgment prediction. However, the idiosyncrasies of judicial texts, including intricate linguistic structures, specialized terminology, and implicit semantic relationships, pose significant challenges to which traditional relation extraction techniques adapt poorly. Moreover, the scarcity of training data tailored to the legal domain hampers a model's ability to learn effectively and extract relations accurately. This paper introduces an approach that applies prompt-based fine-tuning to pre-trained models for relation extraction in judicial texts. Leveraging the robust language-comprehension capabilities of large-scale pre-trained models, we design domain-specific prompt templates that guide the model to capture and understand key information in legal texts more effectively. This method not only improves the accuracy of relation extraction but also substantially reduces reliance on extensive annotated data. Experiments on a judicial text dataset corroborate the effectiveness and superiority of our approach.
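
The abstract describes a cloze-style prompt-tuning setup: a domain-specific template wraps the input sentence and names the entity pair, the pre-trained masked language model predicts a label word at the mask position, and a verbalizer maps that word to a relation class. The paper's actual templates, verbalizer, and backbone are not reproduced here; the sketch below is only a minimal illustration of this style of prompt-based relation extraction using Hugging Face Transformers, where the backbone choice, template wording, single-character label words, and example sentence are all assumptions.

```python
# Minimal sketch of cloze-style prompt-based relation extraction with a
# masked LM. The template wording, relation label words, backbone, and
# example sentence are illustrative assumptions, not the paper's setup.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL_NAME = "bert-base-chinese"  # assumed backbone
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)
model.eval()

# Hypothetical verbalizer: each relation class maps to one label word,
# chosen to be a single token in the model's vocabulary.
RELATION_WORDS = {"defend": "辩", "represent": "代", "no_relation": "无"}

def score_relations(sentence: str, head: str, tail: str) -> dict:
    # Domain-specific cloze template wrapping the input sentence; the
    # model fills the [MASK] slot with a relation label word.
    prompt = f"{sentence}其中{head}与{tail}的关系是{tokenizer.mask_token}。"
    inputs = tokenizer(prompt, return_tensors="pt")
    mask_pos = (inputs.input_ids[0] == tokenizer.mask_token_id).nonzero()[0].item()
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    # Score each relation by the logit of its label word at the mask.
    return {rel: logits[tokenizer.convert_tokens_to_ids(w)].item()
            for rel, w in RELATION_WORDS.items()}

# Invented example: "Zhang San hired Li Si as his defense lawyer."
scores = score_relations("张三聘请李四担任其辩护律师。", "李四", "张三")
print(max(scores, key=scores.get))
```

In prompt-based fine-tuning proper, the template (or continuous prompt embeddings) and the label-word distribution would additionally be trained on annotated judicial examples; the sketch shows only the inference-time scoring step.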




    Published In

    CVIPPR '24: Proceedings of the 2024 2nd Asia Conference on Computer Vision, Image Processing and Pattern Recognition
    April 2024
    373 pages
    ISBN:9798400716607
    DOI:10.1145/3663976

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. judicial text processing
    2. pre-training model
    3. prompt-tuning
    4. relation extraction

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Funding Sources

    • Sichuan Science and Technology Program

    Conference

    CVIPPR 2024

    Acceptance Rates

    Overall Acceptance Rate 14 of 38 submissions, 37%

