Abstract
Marked temporal events are ubiquitous in many domains, and the times and marks (types) of events are usually interrelated. Point processes, and their variants that replace fixed parametric forms with recurrent neural networks (RNNs), model temporal events through intensity functions. However, because these models are typically trained by likelihood maximization, they can perform poorly when the assumed model is misspecified, and their high simulation cost further limits their practicality. Since the intensity function is not always needed explicitly, likelihood-free generative models can be used instead. Generative adversarial networks (GANs) have been applied successfully to point processes, but they still fall short in modeling interdependent event types and times. In this research, a double Wasserstein GAN (WGAN) built on a conditional GAN is proposed, which generates categorical event types conditioned on their times. Experiments on synthetic and real-world data show that the WGAN methods are competitive with the compared intensity-based models and offer faster simulation than intensity-based methods.
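As a rough illustration of the kind of conditional generator described above, the following PyTorch sketch produces inter-event times and then samples categorical marks conditioned on those times through a Gumbel-Softmax relaxation. This is only a minimal sketch under assumed design choices; the LSTM backbone and the names and hyperparameters (ConditionalEventGenerator, noise_dim, hidden_dim, tau) are placeholders introduced here for illustration, not the authors' implementation.

```python
# Minimal sketch (not the paper's released code): a conditional generator that
# draws inter-event times first and then samples categorical marks conditioned
# on those times via the Gumbel-Softmax trick.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConditionalEventGenerator(nn.Module):
    def __init__(self, noise_dim=16, hidden_dim=64, num_marks=5):
        super().__init__()
        self.rnn = nn.LSTM(noise_dim, hidden_dim, batch_first=True)
        self.time_head = nn.Linear(hidden_dim, 1)               # inter-event gap
        self.mark_head = nn.Linear(hidden_dim + 1, num_marks)   # mark logits given time

    def forward(self, z, tau=0.5):
        # z: (batch, seq_len, noise_dim) Gaussian noise
        h, _ = self.rnn(z)
        dt = F.softplus(self.time_head(h))                      # positive inter-event gaps
        logits = self.mark_head(torch.cat([h, dt], dim=-1))     # condition marks on times
        marks = F.gumbel_softmax(logits, tau=tau, hard=True)    # differentiable one-hot marks
        times = torch.cumsum(dt, dim=1)                         # absolute event times
        return times, marks

# Usage: generate 8 sequences of 20 marked events each.
gen = ConditionalEventGenerator()
times, marks = gen(torch.randn(8, 20, 16))
print(times.shape, marks.shape)  # torch.Size([8, 20, 1]) torch.Size([8, 20, 5])
```

The hard one-hot output keeps the marks categorical at the sample level while the Gumbel-Softmax relaxation keeps gradients flowing to the generator during adversarial training.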






Data Availability
All the datasets supporting the findings of the current study are available at github.com/hongyuanmei/neurawkes. The synthetic data were generated by the self-exciting multivariate point process model (SE-MPP) of [37]. The Retweet dataset is the version preprocessed by Mei and Eisner [37], originally collected by Zhao et al. [50]. The Stack Overflow and MIMIC datasets were originally prepared by Du et al. [2] and are also available at github.com/hongyuanmei/neurawkes.
References
Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput 9(8):1735–1780. https://doi.org/10.1162/neco.1997.9.8.1735
Du N, Dai H, Trivedi R et al (2016) Recurrent marked temporal point processes: Embedding event history to vector. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Association for Computing Machinery, New York, NY, USA, KDD ’16, p 1555-1564, https://doi.org/10.1145/2939672.2939875
Arjovsky M, Bottou L (2017) Towards principled methods for training generative adversarial networks. In: International Conference on Learning Representations (ICLR)
Huszar F (2015) How (not) to train your generative model: Scheduled sampling, likelihood, adversary? CoRR abs/1511.05101. http://arxiv.org/abs/1511.05101
Theis L, van den Oord A, Bethge M (2016) A note on the evaluation of generative models. CoRR abs/1511.01844
Ogata Y (1981) On Lewis' simulation method for point processes. IEEE Trans Inf Theory 27:23–30
Xiao S, Farajtabar M, Ye X et al (2017) Wasserstein learning of deep generative point process models. In: Proceedings of the 31st International Conference on Neural Information Processing Systems. Curran Associates Inc., Red Hook, NY, USA, NIPS’17, p 3250-3259
Goodfellow I, Pouget-Abadie J, Mirza M et al (2014) Generative adversarial nets. In: Ghahramani Z, Welling M, Cortes C et al (eds) Advances in Neural Information Processing Systems, vol 27. Curran Associates, Inc., https://proceedings.neurips.cc/paper/2014/file/5ca3e9b122f61f8f06494c97b1afccf3-Paper.pdf
Goodfellow I (2016) NIPS 2016 tutorial: generative adversarial networks. arXiv e-prints arXiv:1701.00160. [cs.LG]
Salimans T, Goodfellow I, Zaremba W et al (2016) Improved techniques for training GANs. In: Lee D, Sugiyama M, Luxburg U, et al (eds) Advances in Neural Information Processing Systems, vol 29. Curran Associates, Inc., https://proceedings.neurips.cc/paper/2016/file/8a3363abe792db2d8761d6403605aeb7-Paper.pdf
Arjovsky M, Chintala S, Bottou L (2017) Wasserstein generative adversarial networks. In: Proceedings of the 34th International Conference on Machine Learning - Volume 70. JMLR.org, ICML’17, p 214-223
Gulrajani I, Ahmed F, Arjovsky M et al (2017) Improved training of Wasserstein GANs. In: Advances in Neural Information Processing Systems, vol 30. Curran Associates, Inc.
Kodali N, Hays J, Abernethy JD et al (2018) On convergence and stability of GANs. arXiv preprint
Fu R, Chen J, Zeng S et al (2020) Time series simulation by conditional generative adversarial net. arXiv preprint
Yoon J, Jarrett D, van der Schaar M (2019) Time-series generative adversarial networks. In: Wallach H, Larochelle H, Beygelzimer A et al (eds) Advances in Neural Information Processing Systems, vol 32. Curran Associates, Inc., https://proceedings.neurips.cc/paper/2019/file/c9efe5f26cd17ba6216bbe2a7d26d490-Paper.pdf
Saha A, Ganguly N (2020) A GAN-based framework for modeling hashtag popularity dynamics using assistive information. In: Proceedings of the 29th ACM International Conference on Information and Knowledge Management. Association for Computing Machinery, New York, NY, USA, CIKM '20, p 1335-1344, https://doi.org/10.1145/3340531.3412025
Mirza M, Osindero S (2014) Conditional generative adversarial nets. arXiv preprint arXiv:1411.1784
Engelmann J, Lessmann S (2021) Conditional Wasserstein GAN-based oversampling of tabular data for imbalanced learning. Exp Syst Appl 174:114582. https://doi.org/10.1016/j.eswa.2021.114582 (www.sciencedirect.com/science/article/pii/S0957417421000233)
Kusner MJ, Hernández-Lobato JM (2016) GANs for sequences of discrete elements with the Gumbel-Softmax distribution. arXiv preprint arXiv:1611.04051. https://doi.org/10.48550/ARXIV.1611.04051
Camino R, Hammerschmidt C, State R (2018) Generating multi-categorical samples with generative adversarial networks. arXiv preprint arXiv:1807.01202. https://doi.org/10.48550/ARXIV.1807.01202
Rocke D (2000) Genetic algorithms + data structures = evolution programs by Z. Michalewicz. J Am Stat Assoc 95:347–348. https://doi.org/10.2307/2669583
Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of ICNN'95 - International Conference on Neural Networks, vol 4, pp 1942–1948. https://doi.org/10.1109/ICNN.1995.488968
Dorigo M, Maniezzo V, Colorni A (1999) Ant system: an autocatalytic optimizing process. Technical Report 91-016, Politecnico di Milano
Colorni A, Dorigo M, Maniezzo V (1991) Distributed optimization by ant colonies. In: Proceedings of the First European Conference on Artificial Life
Abualigah L, Diabat A, Mirjalili S et al (2021) The arithmetic optimization algorithm. Comput Methods Appl Mech Eng 376:113609. https://doi.org/10.1016/j.cma.2020.113609 (www.sciencedirect.com/science/article/pii/S0045782520307945)
Abualigah L, Yousri D, Abd Elaziz M et al (2021) Aquila optimizer: a novel meta-heuristic optimization algorithm. Comput Ind Eng 157:107250. https://doi.org/10.1016/j.cie.2021.107250 (www.sciencedirect.com/science/article/pii/S0360835221001546)
Abualigah L, Elaziz MA, Sumari P et al (2022) Reptile search algorithm (RSA): nature-inspired meta-heuristic optimizer. Exp Syst Appl 191:116158. https://doi.org/10.1016/j.eswa.2021.116158 (www.sciencedirect.com/science/article/pii/S0957417421014810)
Agushaka JO, Ezugwu AE, Abualigah L (2022) Dwarf mongoose optimization algorithm. Comput Methods Appl Mech Eng 391:114570. https://doi.org/10.1016/j.cma.2022.114570 (www.sciencedirect.com/science/article/pii/S0045782522000019)
Oyelade ON, Ezugwu AES, Mohamed TIA et al (2022) Ebola optimization search algorithm: a new nature-inspired metaheuristic optimization algorithm. IEEE Access 10:16150–16177. https://doi.org/10.1109/ACCESS.2022.3147821
Souza LA, Passos LA, Mendel R et al (2021) Fine-tuning generative adversarial networks using metaheuristics. In: Palm C, Deserno TM, Handels H et al (eds) Bildverarbeitung für die Medizin 2021. Springer Fachmedien Wiesbaden, Wiesbaden, pp 205–210
Perry P, Wolfe P (2010) Point process modeling for directed interaction networks. J R Stat Soc Ser B 75. https://doi.org/10.1111/rssb.12013
Fox EW, Short MB, Schoenberg FP et al (2016) Modeling e-mail networks and inferring leadership using self-exciting point processes. J Am Stat Assoc 111(514):564–584. https://doi.org/10.1080/01621459.2015.1135802
Junuthula R, Haghdan M, Xu KS, et al (2019) The block point process model for continuous-time event-based dynamic networks. In: The World Wide Web Conference. Association for Computing Machinery, New York, NY, USA, WWW ’19, p 829-839, https://doi.org/10.1145/3308558.3313633
Arastuie M, Paul S, Xu K (2020) CHIP: a Hawkes process model for continuous-time networks with scalable and consistent estimation. In: Larochelle H, Ranzato M, Hadsell R et al (eds) Advances in Neural Information Processing Systems, vol 33. Curran Associates, Inc., pp 16983–16996, https://proceedings.neurips.cc/paper/2020/file/c5a0ac0e2f48af1a4e619e7036fe5977-Paper.pdf
Wang Y, Liu S, Shen H et al (2017) Marked temporal dynamics modeling based on recurrent neural network. In: Advances in Knowledge Discovery and Data Mining (PAKDD 2017). Springer, pp 786–798, https://doi.org/10.1007/978-3-319-57454-7_61
Xiao S, Yan J, Farajtabar M et al (2017) Joint modeling of event sequence and time series with attentional twin recurrent neural networks. arXiv preprint arXiv:1703.08524
Mei H, Eisner J (2017) The neural Hawkes process: a neurally self-modulating multivariate point process. In: Guyon I, von Luxburg U, Bengio S et al (eds) Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, December 4-9, 2017, Long Beach, CA, USA, pp 6754–6764, https://proceedings.neurips.cc/paper/2017/hash/6463c88460bd63bbe256e495c63aa40b-Abstract.html
Cai H, Nguyen T, Li Y et al (2020) Modeling marked temporal point process using multi-relation structure rnn. Cognitive Computation 12. https://doi.org/10.1007/s12559-019-09690-8
Abu-Srhan A, Abushariah MA, Al-Kadi OS (2022) The effect of loss function on conditional generative adversarial networks. J King Saud Univ Comput Inf Sci. https://doi.org/10.1016/j.jksuci.2022.02.018
Chapfuwa P, Tao C, Li C et al (2018) Adversarial time-to-event modeling. In: Proceedings of the 35th International Conference on Machine Learning, Proceedings of Machine Learning Research, vol 80
Xu L, Skoularidou M, Cuesta-Infante A et al (2019) Modeling tabular data using conditional GAN. In: Advances in Neural Information Processing Systems. Curran Associates Inc., Red Hook, NY, USA
Zhao Z, Kunar A, Van der Scheer H et al (2021) CTAB-GAN: effective table data synthesizing. arXiv preprint arXiv:2102.08369. https://doi.org/10.48550/ARXIV.2102.08369
Hawkes AG (1971) Spectra of some self-exciting and mutually exciting point processes. Biometrika 58(1):83–90 (http://www.jstor.org/stable/2334319)
Rasmussen JG (2018) Lecture notes: temporal point processes and the conditional intensity function. arXiv e-prints arXiv:1806.00221. [stat.ME]
Lewis E, Mohler G (2011) A nonparametric EM algorithm for multiscale Hawkes processes. J Nonparametr Stat
Villani C (2008) Optimal Transport: Old and New. Grundlehren der mathematischen Wissenschaften, Springer Berlin Heidelberg, https://books.google.com/books?id=hV8o5R7_5tkC
Gumbel EJ (1954) Statistical theory of extreme values and some practical applications; a series of lectures. Applied Mathematics Series, vol 33. U.S. Govt. Print. Office, Washington
Jang E, Gu SS, Poole B (2017) Categorical reparameterization with Gumbel-Softmax. arXiv preprint arXiv:1611.01144
Zhao Q, Erdogdu MA, He HY et al (2015) SEISMIC: a self-exciting point process model for predicting tweet popularity. In: Proceedings of the 21th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. Association for Computing Machinery, New York, NY, USA, KDD '15, p 1513-1522, https://doi.org/10.1145/2783258.2783401
Funding
No funding was received to assist with the preparation of this manuscript.
Ethics declarations
Conflict of interest
The authors declare that there is no conflict of interest.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Dizaji, S.H.S., Pashazadeh, S. & Niya, J.M. Wasserstein generative adversarial networks for modeling marked events. J Supercomput 79, 2961–2983 (2023). https://doi.org/10.1007/s11227-022-04781-0