Decreasing the function value in a single step



Post by onako »

Given a function $f(X)$ that is quadratic in $X$, I want to find its minimum by setting its gradient to zero. The gradient is

$$\nabla f(X) = AX-b $$ [1]
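
For concreteness, a gradient of the form [1] corresponds to the standard quadratic

$$f(X) = \tfrac{1}{2}X^\top A X - b^\top X + c $$

with $A$ symmetric.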

So, the solution of the linear system

$$AX=b $$ [2]

in the unknown $X$ would give me the value that minimizes $f(X)$. Since $A$ is strictly diagonally dominant,
the system can be solved by the Jacobi iteration, which is known to converge to the exact solution in this case.
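
For reference, writing $D$ for the diagonal part of $A$, one Jacobi iteration is

$$X_{k+1} = D^{-1}\bigl(b-(A-D)X_k\bigr), \qquad \text{i.e.} \qquad x^{(k+1)}_i = \frac{1}{a_{ii}}\Bigl(b_i - \sum_{j\neq i} a_{ij}x^{(k)}_j\Bigr). $$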

However, I'm interested in whether a single iteration of the Jacobi method, started from some
arbitrary initialization $X_0$, already yields a result $X_1$ that satisfies $f(X_1)<f(X_0)$.
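
To make this concrete, a small numerical sanity check (an illustration only, not a proof) could look like the sketch below. It assumes $f(X)=\tfrac12 X^\top AX - b^\top X$ with $A$ symmetric and strictly diagonally dominant; all names in the snippet are just illustrative.

```python
# One-step Jacobi experiment: is f(X1) < f(X0) for a random start X0?
# Assumes f(X) = 1/2 X^T A X - b^T X with A symmetric and strictly diagonally dominant.
import numpy as np

rng = np.random.default_rng(0)
n = 10

# Random symmetric matrix, then enlarge the diagonal to force strict diagonal dominance.
M = rng.standard_normal((n, n))
A = (M + M.T) / 2
A += np.diag(np.abs(A).sum(axis=1) + 1.0)

b = rng.standard_normal(n)
X0 = rng.standard_normal(n)          # arbitrary initialization

f = lambda X: 0.5 * X @ A @ X - b @ X

# One Jacobi step: X1 = D^{-1} (b - (A - D) X0)
D = np.diag(np.diag(A))
X1 = np.linalg.solve(D, b - (A - D) @ X0)

print("f(X0) =", f(X0), " f(X1) =", f(X1), " decreased:", f(X1) < f(X0))
```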

The Jacobi method is known to yield "progressively better" approximations to the solution of the linear system,
but I'm not sure what this means precisely, or what implications it has for the optimization attempt above.
I found that the Jacobi iterates $\{X_0, X_1, \dots, X_{k-1}, X_{k}\}$ satisfy

$$||X-X_k||_2 < ||X-X_{k-1}||_2 $$ [3]

where $X$ is the true solution of [2]. Does the convergence proof for the Jacobi method actually imply [3]?
And, if so, does that imply progressively lower values of $f(X_k)$, $k\in\{0, 1, \dots\}$?
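
Note that, using [1], one Jacobi step can also be rewritten as

$$X_{k+1} = D^{-1}\bigl(b-(A-D)X_k\bigr) = X_k - D^{-1}\nabla f(X_k) $$

i.e. a gradient step scaled componentwise by $1/a_{ii}$; maybe this form makes the relation between the iterates and the values $f(X_k)$ easier to analyse.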

Any other way of (or suggestion for) proving $f(X_1)<f(X_0)$ is welcome. My intuition tells me
I'm right, but that is not enough.