Research paper
Measuring information transfer by dispersion transfer entropy
Introduction
Complex systems have attracted considerable interest from the scientific community in recent years, as research on most issues can be summarized as the study of the properties and structures of complex systems [1], [2], [3]. Relations of different intensities and directions among components construct the internal structure of a complex system. Causality governs most of the relationships between events [4]. In 1956, Norbert Wiener drew on climatology and neuroscience to assess correlations between signals, which inspired Clive Granger to develop the Granger causal relation. Broadly speaking, Granger causality can be reduced to a conditional-independence framework for evaluating the directional dependence between time series [5], [6], [7].
Wiener influenced not only Granger causality but also another area of dependence analysis: information theory. In information theory, the key measure for a discrete random variable is Shannon entropy, which quantifies the uncertainty of the variable [8], [9], [10]. Wiener's definition of causal dependence rests on predictive power. Therefore, if predictive power can be associated with uncertainty, then causality can be measured by information entropy. However, information-theoretic measures are generally symmetric, such as mutual information [11].
Later, Schreiber proposed transfer entropy, and Kaiser and Schreiber demonstrated that it is more appropriate than mutual information for quantifying the dynamic relationship between time series [12]. Most importantly, transfer entropy is asymmetric and, being based on transition probabilities, it naturally merges directional and dynamic information. The advantage of this information-theoretic functional is that it does not assume any particular model for the interaction between two related systems. Thus, compared with model-based methods, transfer entropy is more sensitive to correlations, which is especially important when unknown nonlinear interactions need to be detected [13], [14], [15].
However, for some complex data, transfer entropy carries a heavy computational burden and may produce spurious detections of causality. In particular, deterministic chaotic time series generated by higher-dimensional nonlinear systems and random processes have the characteristics of a broadband power spectrum and long-term unpredictability [16]. In addition, most work assumes that data are discrete, while most time series data are real-valued, such as log returns in stock markets, medical data, and meteorological data [17]. Symbolization is an effective way to handle this problem and has been applied in many fields. With its extensive application, two issues have emerged in symbolic time series analysis. First, for most systems there is no common way to create the most appropriate partition, yet generating a partition is a basic step of symbolization [18]. The other is the loss of information in the process of converting the original time series into a symbol sequence. Symbolic transfer entropy was proposed to relieve the computational burden of transfer entropy but failed to overcome these shortcomings of symbolization.
In this paper, we therefore propose a novel measure, called dispersion transfer entropy (DTE), to determine the information flow and causal relation between systems through symbolization and transfer entropy. The two major issues of symbolic series are addressed by dynamically selecting parameters according to the Ragwitz criterion [19] and by using dispersion patterns. The Ragwitz criterion is able to select the most appropriate parameters for optimal prediction of the future state. More specifically, the symbolic method meets the following criteria: space efficiency, time efficiency, and correctness of answer sets. In addition, we extend DTE to multivariate systems and propose two kinds of multivariate transfer entropy, dispersion multivariate transfer entropy (DMTE) and dispersion partial transfer entropy (DPTE). The dispersion multivariate transfer entropy curve (DMTEC) is also proposed to demonstrate the information flow over time.
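The dispersion-pattern step that the proposed measure builds on can be sketched as follows. This is a minimal illustration of the standard normal-CDF symbolization used in dispersion entropy (Rostaghi and Azami); the class count `c`, embedding dimension `m`, and delay are shown here as free parameters, not values prescribed by this paper.

```python
import math

def dispersion_symbols(x, c=3):
    # Map each sample to a class in {1, ..., c} through the normal CDF
    # fitted with the sample mean and standard deviation.
    mu = sum(x) / len(x)
    sd = math.sqrt(sum((v - mu) ** 2 for v in x) / len(x))
    y = [0.5 * (1.0 + math.erf((v - mu) / (sd * math.sqrt(2.0)))) for v in x]
    return [min(c, max(1, round(c * yi + 0.5))) for yi in y]

def dispersion_patterns(z, m=2, delay=1):
    # Collect length-m embedding vectors of classes: the dispersion patterns.
    return [tuple(z[i + j * delay] for j in range(m))
            for i in range(len(z) - (m - 1) * delay)]

z = dispersion_symbols([0.1, 2.3, -1.2, 0.5, 1.8, -0.7])
patterns = dispersion_patterns(z, m=2)
```

These coarse-grained patterns, rather than raw amplitudes, would then enter the transition-probability estimates, which is what keeps the estimator cheap and noise-tolerant.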
The rest of the paper is organized as follows. Section 2 provides an introduction to dispersion transfer entropy and its extensions. Numerical simulations are used to verify the efficiency of our methods in Section 3. Section 4 shows the application to stock markets. Finally, a summary is given in the last section.
Transfer entropy
Given two systems X and Y, denote the time series observed from X and Y accordingly. Shannon entropy was introduced to measure the uncertainty and complexity of a system. Moreover, Shannon entropy also gives the average number of bits required for optimal coding of signal X without considering possible correlations, and is defined as follows:
$$H(X) = -\sum_{x} p(x)\,\log_2 p(x).$$
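As a concrete reading of this definition, a plug-in estimate of Shannon entropy from a finite symbol sample might look like the sketch below (the function name and sample are illustrative, not from the paper):

```python
import numpy as np

def shannon_entropy(symbols, base=2):
    # H(X) = -sum_x p(x) log p(x), with p(x) estimated by relative frequencies.
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)) / np.log(base))

# A balanced binary signal needs one bit per symbol on average.
h = shannon_entropy([0, 1, 0, 1, 1, 0])
```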
Schreiber proposed an information-theoretic measure to detect asymmetry in the interaction of systems, called transfer entropy.
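For intuition, Schreiber's transfer entropy from Y to X with history length one is $TE_{Y\to X} = \sum p(x_{t+1}, x_t, y_t)\,\log_2\!\frac{p(x_{t+1}\mid x_t, y_t)}{p(x_{t+1}\mid x_t)}$. A minimal plug-in estimator for discrete symbol sequences might be sketched as follows (history length, helper names, and the toy data are our assumptions, not the paper's implementation):

```python
from collections import Counter
import math
import random

def transfer_entropy(x, y, base=2):
    # Plug-in estimate of TE_{Y->X} with history length 1:
    # sum over (x_{t+1}, x_t, y_t) of p_joint * log[ p(x1|x0,y0) / p(x1|x0) ].
    n = len(x) - 1
    c_xxy = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    c_xy = Counter(zip(x[:-1], y[:-1]))           # (x_t, y_t)
    c_xx = Counter(zip(x[1:], x[:-1]))            # (x_{t+1}, x_t)
    c_x = Counter(x[:-1])                         # x_t
    te = 0.0
    for (x1, x0, y0), c in c_xxy.items():
        p_joint = c / n
        p_cond_xy = c / c_xy[(x0, y0)]
        p_cond_x = c_xx[(x1, x0)] / c_x[x0]
        te += p_joint * math.log(p_cond_xy / p_cond_x, base)
    return te

# Toy check: X copies Y with lag 1, so information flows only Y -> X.
random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]
te_yx = transfer_entropy(x, y)  # close to 1 bit
te_xy = transfer_entropy(y, x)  # close to 0
```

Note the asymmetry the text emphasizes: swapping the roles of the two series changes the estimate, unlike mutual information.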
Numerical simulation on DTE
In order to verify the effectiveness of our study, we apply dispersion transfer entropy (DTE) to a unidirectional information-flow system, the coupled Henon maps, and to a bidirectional information-flow system, the two-component ARFIMA process. The two unidirectionally coupled Henon maps are defined, following Schiff et al. [31], as
$$x_1(n+1) = 1.4 - x_1^2(n) + 0.3\,x_2(n), \qquad x_2(n+1) = x_1(n),$$
$$y_1(n+1) = 1.4 - \left[C\,x_1(n)\,y_1(n) + (1-C)\,y_1^2(n)\right] + 0.3\,y_2(n), \qquad y_2(n+1) = y_1(n),$$
where X is the driving system and Y is the response system with coupling strength C ∈ [0, 1].
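A simulation of this coupled pair might look like the following sketch; the initial conditions, transient length, and seed are our own choices for illustration, not values taken from the paper.

```python
import numpy as np

def coupled_henon(n, C, discard=1000, seed=0):
    # Unidirectionally coupled Henon maps (X drives Y) in the commonly used
    # form from Schiff et al.; C in [0, 1] is the coupling strength.
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 0.5, 2)  # (x1, x2)
    y = rng.uniform(0.0, 0.5, 2)  # (y1, y2)
    out = np.empty((n, 2))
    for t in range(n + discard):
        x1_new = 1.4 - x[0] ** 2 + 0.3 * x[1]
        y1_new = 1.4 - (C * x[0] * y[0] + (1.0 - C) * y[0] ** 2) + 0.3 * y[1]
        x = np.array([x1_new, x[0]])
        y = np.array([y1_new, y[0]])
        if t >= discard:
            out[t - discard] = (x[0], y[0])
    return out[:, 0], out[:, 1]

xs, ys = coupled_henon(500, C=0.4)
```

Discarding an initial transient lets the trajectories settle onto the attractor before any entropy estimate is taken.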
To minimize the effects
Empirical experiments
After confirming the effectiveness of our method, we apply DTE and DMTE to stock markets to detect the properties of financial data. The significance of all results below has been assessed. We choose eleven stock markets and study the log returns from January 1, 1990 to December 16, 2019, as the additivity of log returns brings great convenience to computation and modeling. The stock markets are listed in Table 1; we can divide them into three categories according to geographic region.
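The additivity mentioned here is easy to verify: since the log return is $r_t = \ln(P_t / P_{t-1})$, daily log returns telescope, so their sum equals the log return over the whole window. A minimal sketch (the prices are made up for illustration):

```python
import numpy as np

def log_returns(prices):
    # r_t = ln(P_t) - ln(P_{t-1}); differences of logs telescope when summed.
    p = np.asarray(prices, dtype=float)
    return np.diff(np.log(p))

prices = [100.0, 101.0, 99.5, 102.0]
r = log_returns(prices)
# r.sum() equals ln(102.0 / 100.0), the log return over the full span.
```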
Conclusions
Transfer entropy plays an important role in the study of causal relationships and information transmission. However, for some complex data, it carries a heavy computational burden and may produce spurious detections of causality. In this paper, we propose dispersion transfer entropy (DTE), using symbolic analysis to solve this problem. Symbolization is a practical way to obtain higher computational efficiency and stronger immunity to noise, but generating partitions and the loss of information are two issues that remain.
Declaration of Competing Interest
None.
CRediT authorship contribution statement
Boyi Zhang: Conceptualization, Methodology, Software, Writing - original draft, Formal analysis, Investigation, Data curation, Writing - review & editing, Visualization. Pengjian Shang: Conceptualization, Methodology, Validation, Resources, Supervision, Project administration, Funding acquisition.
Acknowledgments
Financial support from the Fundamental Research Funds for the Central Universities (2018JBZ104) and the National Natural Science Foundation of China (61771035) is gratefully acknowledged.
References (32)
- Causality, cointegration, and control. J. Econ. Dyn. Control (1988).
- Wiener–Granger causality: a well established methodology. NeuroImage (2011).
- Towards a unifying approach to diversity measures: bridging the gap between the Shannon entropy and Rao's quadratic index. Theor. Popul. Biol. (2006).
- Symbolic time series analysis via wavelet-based partitioning. Signal Process. (2006).
- Transfer entropy between multivariate time series. Commun. Nonlinear Sci. Numer. Simul. (2017).
- Modeling long-range cross-correlations in two-component ARFIMA and FIARCH processes. Physica A (2008).
- The economy as an evolving complex system (2018).
- Dynamics in action: intentional behavior as a complex system. Emergence (2000).
- What is a complex system? Eur. J. Philos. Sci. (2013).
- Learning a theory of causality. Psychol. Rev. (2011).
- Kernel method for nonlinear Granger causality. Phys. Rev. Lett.
- Divergence measures based on the Shannon entropy. IEEE Trans. Inf. Theory.
- Shannon entropy as a new measure of aromaticity, Shannon aromaticity. Phys. Chem. Chem. Phys.
- The relation between Granger causality and directed information theory: a review. Entropy.
- Detecting nonlinearity in structural systems using the transfer entropy. Phys. Rev. E.
- Transfer entropy: a model-free measure of effective connectivity for the neurosciences. J. Comput. Neurosci.