Preconditioned parallel multisplitting USAOR method for H-matrices linear systems
Introduction
Sometimes one has to solve a nonsingular linear system
Ax = b,  (1)
where A ∈ R^{n×n} is nonsingular and b is an n-dimensional vector. The basic iterative method for solving (1) is
Mx^{(i+1)} = Nx^{(i)} + b,  i = 0, 1, 2, …,  (2)
where A = M − N and M is nonsingular, so (2) can be written as
x^{(i+1)} = Tx^{(i)} + M^{−1}b,  i = 0, 1, 2, …,
where T = M^{−1}N.
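As a concrete illustration, the basic splitting iteration (2) can be sketched in Python with NumPy; the Jacobi choice M = diag(A) and the test matrix below are our own assumptions, not from the paper.

```python
import numpy as np

def splitting_iteration(A, b, M, x0, iters=100):
    """Basic iteration M x^(i+1) = N x^(i) + b with N = M - A,
    i.e. x^(i+1) = T x^(i) + M^{-1} b, where T = M^{-1} N."""
    N = M - A
    x = x0.astype(float)
    for _ in range(iters):
        x = np.linalg.solve(M, N @ x + b)
    return x

# Jacobi splitting: M is the diagonal part of A (our illustrative choice)
A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([2.0, 4.0, 10.0])
x = splitting_iteration(A, b, np.diag(np.diag(A)), np.zeros(3))
# x approximates the exact solution [1, 2, 3]
```

The iteration converges here because ρ(T) < 1 for this strictly diagonally dominant A.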
O'Leary and White [12] presented the matrix multisplitting method in 1985 for solving large sparse linear systems in parallel on multiprocessor systems, and it was further studied by many authors; see [1], [2], [6], [7], [8], [11], [14], [16]. Neumann and Plemmons [11] developed some more refined convergence results for one of the cases considered in [12]. Elsner [7] established comparison theorems for the asymptotic convergence rate of this case. Frommer and Mayer [8] discussed two different variants of relaxed multisplitting methods. White [14] studied the convergence properties of the above matrix multisplitting methods for symmetric positive definite matrices. Bai [1] studied the convergence domain of the matrix multisplitting relaxation methods. Cao and Liu [6] studied the convergence of two different variants of multisplitting relaxation methods with different weighting schemes. Zhang et al. [16] presented a parallel multisplitting method for H-matrices. For parallel matrix multisplitting methods, [2] is a comprehensive survey whose references are worth reading. Bai et al. [3] discussed weak convergence of splitting methods for solving singular linear systems. In this paper, we study the preconditioned multisplitting USAOR method for linear systems with H-matrices and analyze its convergence theoretically.
The multisplitting method is as follows.
If A is a nonsingular n × n matrix, and the triples (M_k, N_k, E_k), k = 1, 2, …, s, satisfy:
- (1) A = M_k − N_k, k = 1, 2, …, s;
- (2) M_k is nonsingular, k = 1, 2, …, s;
- (3) E_k is a nonnegative diagonal matrix, and Σ_{k=1}^{s} E_k = I;
then (M_k, N_k, E_k), k = 1, 2, …, s, is called a multisplitting of A. The multisplitting method for solving (1) is
x^{(i+1)} = Σ_{k=1}^{s} E_k M_k^{−1}(N_k x^{(i)} + b),  i = 0, 1, 2, ….
We denote T = Σ_{k=1}^{s} E_k M_k^{−1} N_k and c = Σ_{k=1}^{s} E_k M_k^{−1} b, so that x^{(i+1)} = Tx^{(i)} + c; T is called the iteration matrix.
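A minimal Python sketch of the multisplitting iteration (run sequentially here, though each inner solve is meant for its own processor). The two Gauss–Seidel-type splittings and the weighting matrices E_k below are illustrative assumptions of ours.

```python
import numpy as np

def multisplitting(A, b, Ms, Es, x0, iters=200):
    """x^(i+1) = sum_k E_k M_k^{-1} (N_k x^(i) + b), with N_k = M_k - A
    and sum_k E_k = I; each k-th solve could run on its own processor."""
    x = x0.astype(float)
    for _ in range(iters):
        x = sum(E @ np.linalg.solve(M, (M - A) @ x + b)
                for M, E in zip(Ms, Es))
    return x

A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([2.0, 4.0, 10.0])
Ms = [np.tril(A), np.triu(A)]  # forward / backward Gauss-Seidel factors
Es = [np.diag([1.0, 0.5, 0.0]), np.diag([0.0, 0.5, 1.0])]  # E_1 + E_2 = I
x = multisplitting(A, b, Ms, Es, np.zeros(3))
# x approximates the exact solution [1, 2, 3]
```

Since A is an M-matrix and both splittings are weak regular, the weighted combination converges.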
In [16], Zhang et al. presented the local relaxed parallel multisplitting method as follows.
Algorithm 1 (Local Relaxed Parallel Multisplitting Method)
Given an initial vector x^{(0)}. For i = 0, 1, 2, …, repeat (I) and (II) until convergence.
- (I) For k = 1, 2, …, s, solve for y_k:
M_k y_k = N_k x^{(i)} + b.
- (II) Compute:
x^{(i+1)} = Σ_{k=1}^{s} E_k y_k.
Let A = I − L_k − U_k, k = 1, 2, …, s, where I is the identity matrix, the L_k are strictly lower triangular matrices, and the U_k are general matrices.
Now, we consider one kind of parallel multisplitting unsymmetric accelerated over-relaxation (USAOR) method, called the local relaxed parallel multisplitting USAOR method (LUSAOR).
Algorithm 1 associated with the LUSAOR method can be written as
x^{(i+1)} = Σ_{k=1}^{s} E_k T_k x^{(i)} + c,  i = 0, 1, 2, …,
where each T_k = U_{k,γ2,ω2} L_{k,γ1,ω1} is the USAOR iteration matrix, with
L_{k,γ1,ω1} = (I − γ1 L_k)^{−1}[(1 − ω1)I + (ω1 − γ1)L_k + ω1 U_k],
U_{k,γ2,ω2} = (I − γ2 U_k)^{−1}[(1 − ω2)I + (ω2 − γ2)U_k + ω2 L_k],
and c is the corresponding constant vector.
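To make the two USAOR half-steps concrete, here is a hedged Python sketch of one sweep for a single splitting A = I − L − U (s = 1); the parameter names g1, w1, g2, w2 and the test matrix are our own assumptions, not the paper's notation.

```python
import numpy as np

def usaor_step(L, U, b, x, g1, w1, g2, w2):
    """One USAOR sweep for A = I - L - U (unit diagonal):
    a forward AOR half-step with (g1, w1), then a backward one with (g2, w2)."""
    I = np.eye(len(b))
    # forward: (I - g1 L) x_half = [(1 - w1)I + (w1 - g1)L + w1 U] x + w1 b
    x_half = np.linalg.solve(
        I - g1 * L, ((1 - w1) * I + (w1 - g1) * L + w1 * U) @ x + w1 * b)
    # backward: (I - g2 U) x_new = [(1 - w2)I + (w2 - g2)U + w2 L] x_half + w2 b
    return np.linalg.solve(
        I - g2 * U, ((1 - w2) * I + (w2 - g2) * U + w2 * L) @ x_half + w2 * b)

A = np.array([[1.0, -0.2, -0.3],
              [-0.1, 1.0, -0.2],
              [-0.2, -0.1, 1.0]])           # unit-diagonal M-matrix
L, U = -np.tril(A, -1), -np.triu(A, 1)      # so that A = I - L - U
b = A @ np.array([1.0, 2.0, 3.0])           # exact solution is [1, 2, 3]
x = np.zeros(3)
for _ in range(100):
    x = usaor_step(L, U, b, x, 0.9, 1.0, 0.9, 1.0)
# x approximates [1, 2, 3]
```

Note that the exact solution is a fixed point of each half-step, and the sweep contracts for this strictly diagonally dominant A.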
In order to solve (1) faster, a nonsingular preconditioner P is introduced. The original system (1) can be transformed into the preconditioned form
PAx = Pb.
Then we can define the basic iterative scheme
x^{(i+1)} = M_P^{−1}N_P x^{(i)} + M_P^{−1}Pb,  i = 0, 1, 2, …,
where PA = M_P − N_P is a splitting of PA and M_P is nonsingular.
Suppose that A has unit diagonal elements and that A is an H-matrix. In the literature, various authors have suggested different models of the preconditioner P for linear system (1); see [10], [15].
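As a hypothetical illustration of why preconditioning can help, the sketch below uses a Gunawardena-type choice P = I + S from the literature (which need not coincide with the P considered in this paper) and compares the Gauss–Seidel spectral radii for A and PA on a small example of ours.

```python
import numpy as np

def spectral_radius(T):
    return float(max(abs(np.linalg.eigvals(T))))

def gs_iteration_matrix(B):
    M = np.tril(B)                    # Gauss-Seidel splitting B = M - N
    return np.linalg.solve(M, M - B)  # T = M^{-1} N

# illustrative unit-diagonal M-matrix (our own example)
A = np.array([[1.0, -0.3, -0.2],
              [-0.2, 1.0, -0.3],
              [-0.3, -0.2, 1.0]])

# P = I + S, where S picks up the first upper co-diagonal of -A
S = np.zeros_like(A)
for i in range(len(A) - 1):
    S[i, i + 1] = -A[i, i + 1]
P = np.eye(len(A)) + S

rho_A = spectral_radius(gs_iteration_matrix(A))       # ~0.283
rho_PA = spectral_radius(gs_iteration_matrix(P @ A))  # ~0.112, smaller
```

The preconditioned iteration matrix has a smaller spectral radius, so Gauss–Seidel on PA converges faster than on A, which is the effect the comparison theorems below formalize.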
In this paper, we consider the preconditioner P as follows
In this paper, we will analyze the convergence of the preconditioned multisplitting USAOR method theoretically and give some comparison results on the spectral radius. Finally, we provide a numerical example.
The preconditioned parallel multisplitting USAOR method
Let PA = D_P − L̄_k − Ū_k, k = 1, 2, …, s, where D_P is the diagonal part of PA, the L̄_k are strictly lower triangular matrices, and the Ū_k are general matrices.
Then Algorithm 1 associated with the preconditioned parallel multisplitting USAOR method (LPUSAOR) can be written in the same form as the LUSAOR method, with the USAOR iteration matrices built from the splittings of PA in place of those of A.
Preliminaries
We need the following results.
Lemma 1 [9]. A is an H-matrix if and only if there is a positive vector r such that 〈A〉r > 0, where 〈A〉 = (〈a〉_{ij}) is the comparison matrix of A, defined by 〈a〉_{ii} = |a_{ii}| and 〈a〉_{ij} = −|a_{ij}| for i ≠ j.
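Lemma 1 suggests a simple numerical test (a sketch under our own naming; the candidate vector r = 〈A〉^{−1}e is a standard trick, not from the paper):

```python
import numpy as np

def comparison_matrix(A):
    """<A>: |a_ii| on the diagonal, -|a_ij| off the diagonal."""
    C = -np.abs(np.asarray(A, dtype=float))
    np.fill_diagonal(C, np.abs(np.diag(A)))
    return C

def is_h_matrix(A):
    """A is an H-matrix iff <A> r > 0 for some r > 0 (Lemma 1).
    We try r = <A>^{-1} e, which is positive exactly when <A> is a
    nonsingular M-matrix (cf. Lemma 5)."""
    C = comparison_matrix(A)
    try:
        r = np.linalg.solve(C, np.ones(len(C)))
    except np.linalg.LinAlgError:
        return False
    return bool(np.all(r > 0) and np.all(C @ r > 0))

ok1 = is_h_matrix(np.array([[4.0, -1.0, 2.0],
                            [1.0, 5.0, -1.0],
                            [-2.0, 1.0, 4.0]]))  # True: strictly diagonally dominant
ok2 = is_h_matrix(np.array([[1.0, -2.0],
                            [-2.0, 1.0]]))       # False: <A> is not an M-matrix
```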
Lemma 2 [5]. Let A = M − N be an M-splitting of A; then ρ(M^{−1}N) < 1 if and only if A is an M-matrix.
Lemma 3 [5]. Let A and B be two n × n matrices with 0 ≤ B ≤ A, then ρ(B) ≤ ρ(A).
Lemma 4 [9]. If A is an H-matrix, then |A^{−1}| ≤ 〈A〉^{−1}.
Lemma 5 [13]. Let A be a Z-matrix; then the following statements are equivalent:
- (1) A is an M-matrix;
- (2) there is a positive vector x ∈ R^n such that Ax > 0;
- (3) A is nonsingular and A^{−1} ≥ 0.
Lemma 6 [11].
Convergence result
In this section, we will present the convergence result of the preconditioned multisplitting USAOR method.
Theorem 1. Let A be an H-matrix with unit diagonal elements. If the parameters α_1 and β_n of the preconditioner P satisfy the required conditions, then PA is an H-matrix.
Proof. Since A is an H-matrix with unit diagonal elements, 〈A〉 is an M-matrix. Let
Comparison results of spectral radius
Let
where the splittings are as above; then the iteration matrix of the parallel multisplitting USAOR method for 〈A〉 is as follows
Let
Numerical example
Consider the linear system where
By a simple computation, we know that A is an M-matrix.
We take parameters that satisfy the conditions of Theorem 1, and determine the corresponding splittings of the matrix A.
Acknowledgments
The authors would like to thank the referees for their valuable comments and suggestions, which greatly improved the original version of this paper.
This work was supported by the National Natural Science Foundation of China (Grant no. 11001144) and the Natural Science Foundation of Shandong Province (no. ZR2012AL09).
References
- et al., A unified framework for the construction of various matrix multisplitting iterative methods for large sparse system of linear equations, Comput. Math. Appl. (1996)
- et al., Weak convergence theory of quasi-nonnegative splittings for singular matrices, Appl. Numer. Math. (2003)
- et al., Convergence of relaxed parallel multisplitting methods with different weighting schemes, Appl. Math. Comput. (1999)
- et al., Convergence of relaxed parallel multisplitting methods, Linear Algebra Appl. (1989)
- Two-sided bounds for the inverse of an H-matrix, Linear Algebra Appl. (1995)
- et al., Convergence of parallel multisplitting iterative methods for M-matrices, Linear Algebra Appl. (1987)
- et al., Comparison theorems of preconditioned Gauss–Seidel methods for M-matrices, Appl. Math. Comput. (2012)
- et al., Convergence of relaxed multisplitting USAOR methods for H-matrices linear systems, Appl. Math. Comput. (2008)