What is this article about? Generative summarization with the BERT model in the geosciences domain

  • Research Article
  • Published in: Earth Science Informatics

Abstract

In recent years, a large amount of data has accumulated in sources such as geological journals and report literature. These documents contain a wealth of information, yet they have not been fully exploited or mined. Automatic information extraction offers an effective way to achieve new discoveries and pursue further analysis, and it is of great value in aiding and supporting the analyses of users, researchers and decision makers. In this paper, we utilize the bidirectional encoder representations from transformers (BERT) model, which is fine-tuned and then applied to automatically generate the title of a given input text, based on a collection of published literature samples. The framework contains an encoder module, a decoder module and a training module. The core stages of summary generation combine the encoder and decoder modules, and a multi-stage function is then used to connect the modules, endowing the text summarization model with a multi-task learning architecture. Compared with other baseline models, the proposed model obtains the best results on the constructed dataset. Based on the proposed model, an automatic geological briefing generation platform is therefore developed and deployed online to support the mining of key areas in the literature and their visual presentation and analysis.
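As an illustration of the encoder-decoder setup described in the abstract, the following is a minimal sketch, not the authors' implementation, of warm-starting a BERT-to-BERT encoder-decoder for title generation with the Hugging Face transformers library. The checkpoint name "bert-base-chinese", the helper title_for and the generation settings are assumptions for illustration only, and the paper's multi-stage, multi-task training procedure is not reproduced here.

# Minimal sketch, assuming the Hugging Face "transformers" library and the
# public "bert-base-chinese" checkpoint; this is NOT the authors' code and
# omits their multi-stage / multi-task training procedure.
from transformers import BertTokenizerFast, EncoderDecoderModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")

# Warm-start both the encoder and the decoder from the same BERT checkpoint.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-chinese", "bert-base-chinese"
)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
model.config.eos_token_id = tokenizer.sep_token_id

def title_for(abstract_text: str) -> str:
    """Generate a short title for an input abstract via beam search."""
    inputs = tokenizer(abstract_text, return_tensors="pt",
                       truncation=True, max_length=512)
    output_ids = model.generate(inputs.input_ids,
                                attention_mask=inputs.attention_mask,
                                max_length=32, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Fine-tuning would pair each abstract (encoder input) with its title
# (decoder labels) and minimize the cross-entropy loss returned by
# model(input_ids=..., attention_mask=..., labels=...); before such
# fine-tuning, the generated titles are not meaningful.

Warm-starting both sides from the same pretrained checkpoint is one common way to reuse BERT representations in both the encoder and the decoder; the multi-stage connection between modules described in the paper would be built on top of such a basic setup.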

Acknowledgements

We would like to thank the anonymous reviewers for carefully reading this paper and for their very useful comments. This study was financially supported by the National Natural Science Foundation of China (42050101, U1711267, 41871311, 41871305), the National Key Research and Development Program (2018YFB0505500, 2018YFB0505504) and the Fundamental Research Funds for the Central Universities, China University of Geosciences (Wuhan) (No. CUG2106116).

Author information

Corresponding author

Correspondence to Qinjun Qiu.

Ethics declarations

Conflict of interest

The authors declare no conflicts of interest.

Additional information

Communicated by: H. Babaie

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Ma, K., Tian, M., Tan, Y. et al. What is this article about? Generative summarization with the BERT model in the geosciences domain. Earth Sci Inform 15, 21–36 (2022). https://doi.org/10.1007/s12145-021-00695-2
