What are the Jacobi and Gauss-Seidel methods?
Jacobi and Gauss-Seidel are iterative methods used to solve linear systems of equations.
Why do we need them?
For a small 3x3 matrix like this one, you may think it would be much simpler to solve the system with a more classical approach like Gaussian elimination. And you are absolutely right! But these methods have a big advantage: when working with gigantic matrices, they can produce a nearly correct result in much less time.
An example:
Assume we have a 7,000,000 × 7,000,000 matrix, for example from a particle or gravity simulation in physics.
When using a classical approach like Gaussian elimination, the runtime complexity is on the order of 2n³/3 operations. With Jacobi / Gauss-Seidel it is n² × iterations ((number of rows × number of columns) × number of iterations).
Assuming a maximum of 100,000 iterations, we save over 2.2 × 10²⁰ operations.
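The savings above can be checked with a few lines of arithmetic, plugging the stated complexities into the numbers from the example:

```python
# Rough operation-count comparison for an n x n system, using the
# complexities stated above: ~2n^3/3 for Gaussian elimination versus
# n^2 * iterations for Jacobi / Gauss-Seidel.
n = 7_000_000
iterations = 100_000

gauss = 2 * n**3 / 3            # ~2.3e20 operations
iterative = n**2 * iterations   # ~4.9e18 operations
savings = gauss - iterative

print(f"Gaussian elimination: {gauss:.1e} operations")
print(f"Iterative methods:    {iterative:.1e} operations")
print(f"Savings:              {savings:.1e} operations")
```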
How does it work?
A simple cooking recipe:
- We make an initial guess at what our results could be (normally x₁, x₂, …, xₙ = 0).
- We calculate new values with the update formula xᵢ = (bᵢ − Σⱼ≠ᵢ aᵢⱼ·xⱼ) / aᵢᵢ, using the xⱼ values from the previous iteration.
- We calculate an error by comparing the results of the previous iteration with our new results.
- We repeat steps 2 and 3 until the error is small enough.
Jacobi method:
To determine the error, you simply calculate the total deviation between the xᵢ values of one iteration and those of the next (and so on for every further iteration).
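The recipe above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production solver; the function name, tolerance, and the small test matrix are my own choices. The matrix is strictly diagonally dominant, so the iteration converges (its exact solution is [1, 1, 1]):

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=100_000):
    n = len(b)
    x = np.zeros(n)                 # step 1: initial guess x1..xn = 0
    for _ in range(max_iter):
        x_new = np.empty(n)
        for i in range(n):          # step 2: new values from the OLD vector only
            s = A[i] @ x - A[i, i] * x[i]          # sum of a_ij * x_j, j != i
            x_new[i] = (b[i] - s) / A[i, i]
        error = np.sum(np.abs(x_new - x))          # step 3: total deviation
        x = x_new
        if error < tol:             # step 4: stop once the error is small enough
            break
    return x

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])
print(jacobi(A, b))   # converges toward the exact solution [1, 1, 1]
```

Note that every new value in an iteration is computed from the old vector, which is why the n updates of one Jacobi sweep can run in parallel.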
Gauss-Seidel method:
The Gauss-Seidel method uses the same formula as Jacobi, but updates the values xᵢ immediately within the same iteration. It therefore needs fewer iterations to reach a good result, but the calculations within one iteration cannot be parallelized, because each xᵢ depends on the freshly updated values x₁, …, xᵢ₋₁.
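A Gauss-Seidel sketch under the same assumptions as the Jacobi example above (illustrative names and tolerance, strictly diagonally dominant test matrix). The only change is that the updates happen in place, so later rows already see the fresh values:

```python
import numpy as np

def gauss_seidel(A, b, tol=1e-10, max_iter=100_000):
    n = len(b)
    x = np.zeros(n)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # A[i] @ x already mixes freshly updated entries (j < i) with
            # old entries (j > i); this in-place update is the only
            # difference from Jacobi.
            s = A[i] @ x - A[i, i] * x[i]
            x[i] = (b[i] - s) / A[i, i]
        if np.sum(np.abs(x - x_old)) < tol:
            break
    return x

A = np.array([[4.0, 1.0, 1.0],
              [1.0, 5.0, 2.0],
              [1.0, 2.0, 6.0]])
b = np.array([6.0, 8.0, 9.0])
print(gauss_seidel(A, b))   # converges toward the exact solution [1, 1, 1]
```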
One important last point
Gauss-Seidel and Jacobi do not work for every matrix. In fact, the example given in this article does not converge to a final result. The methods converge only if the spectral radius of the iteration matrix is less than 1. A simpler condition that guarantees convergence is strict diagonal dominance; it is sufficient but not necessary, so some matrices converge even without it.
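The diagonal-dominance condition is easy to check before running either method. A minimal sketch (the function name is my own) that tests whether each diagonal entry exceeds the sum of the absolute values of the other entries in its row:

```python
import numpy as np

def is_strictly_diagonally_dominant(A):
    # |a_ii| > sum of |a_ij| for j != i, in every row
    diag = np.abs(np.diag(A))
    off_diag = np.sum(np.abs(A), axis=1) - diag
    return bool(np.all(diag > off_diag))

A_good = np.array([[4.0, 1.0, 1.0],
                   [1.0, 5.0, 2.0],
                   [1.0, 2.0, 6.0]])
A_bad = np.array([[1.0, 3.0],
                  [2.0, 1.0]])
print(is_strictly_diagonally_dominant(A_good))  # True
print(is_strictly_diagonally_dominant(A_bad))   # False
```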