For simplicity, consider a nonlinear equation f(x) = 0 with Jacobian J(x). The Taylor expansion around x1 is:
f(x2) = f(x1) + J(x1) * (x2 - x1) + O((x2 - x1)^2)
Newton's method finds x2 by setting the linear part of this expansion to zero:
0 = f(x1) + J(x1) * (x2 - x1)
Solving for x2 we get:
x2 = x1 - inv(J(x1)) * f(x1)
Now we usually need to iterate for this to converge:
while norm(f(x)) > tol
    compute f and J
    d = inv(J) * f
    x = x - d
end
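As a concrete illustration, here is the loop above in Python/NumPy, applied to a small 2x2 system of my own choosing (the system, initial guess, and tolerance are not from the text). One change from the pseudocode: instead of forming inv(J), it is cheaper and more stable to solve the linear system J*d = f directly.

```python
import numpy as np

def f(x):
    # Illustrative system: x0^2 + x1^2 - 4 = 0 and x0 - x1 = 0
    return np.array([x[0]**2 + x[1]**2 - 4.0, x[0] - x[1]])

def J(x):
    # Jacobian of f
    return np.array([[2.0 * x[0], 2.0 * x[1]],
                     [1.0, -1.0]])

x = np.array([1.0, 0.5])  # initial guess
tol = 1e-10
while np.linalg.norm(f(x)) > tol:
    # Solve J(x) * d = f(x) rather than computing inv(J) explicitly
    d = np.linalg.solve(J(x), f(x))
    x = x - d

print(x)  # converges to (sqrt(2), sqrt(2))
```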
Let's try to reduce the cost of Newton's method by using Gauss-Seidel (GS) to approximately solve J*d = f instead of inverting J:
while norm(f(x)) > tol1
    compute f and J
    d = 0
    while norm(J*d - f) > tol2
        perform GS iterations to solve J*d = f
    end
    x = x - d
end
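Here is a runnable sketch of this two-loop version. The system, tolerances, and sweep structure are my own illustration; I picked a system whose Jacobian is diagonally dominant, which GS needs in order to converge. The `x = x - d` step after the inner solve is the Newton update.

```python
import numpy as np

def f(x):
    # Mildly nonlinear system with a diagonally dominant Jacobian
    return np.array([3.0 * x[0] + np.sin(x[1]) - 2.0,
                     np.cos(x[0]) + 2.0 * x[1] - 1.0])

def jacobian(x):
    return np.array([[3.0, np.cos(x[1])],
                     [-np.sin(x[0]), 2.0]])

x = np.zeros(2)
tol1, tol2 = 1e-10, 1e-12
while np.linalg.norm(f(x)) > tol1:
    J, b = jacobian(x), f(x)
    d = np.zeros(2)
    while np.linalg.norm(J @ d - b) > tol2:
        # One Gauss-Seidel sweep: each component uses the latest values
        for i in range(2):
            d[i] = (b[i] - J[i] @ d + J[i, i] * d[i]) / J[i, i]
    x = x - d  # apply the Newton step once the inner solve converges

print(x)
```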
So can we combine the two loops? With GS we can:
compute J
while norm(f(x)) > tol
    perform GS iterations to solve J*d = f, update x and f after each constraint is solved
end
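A minimal sketch of the combined loop on the same illustrative system. Here I make the simplifying assumption that each "constraint" is one row of the system and corrects only its own coordinate; in a physics engine each constraint would push the bodies along its Jacobian row. The point is that x is updated immediately after each constraint, so the next constraint sees fresh f values.

```python
import numpy as np

def f(x):
    return np.array([3.0 * x[0] + np.sin(x[1]) - 2.0,
                     np.cos(x[0]) + 2.0 * x[1] - 1.0])

def jacobian(x):
    return np.array([[3.0, np.cos(x[1])],
                     [-np.sin(x[0]), 2.0]])

x = np.zeros(2)
tol = 1e-10
J = jacobian(x)  # computed once, as in the pseudocode
while np.linalg.norm(f(x)) > tol:
    for i in range(2):
        # Solve constraint i, then apply the correction to x immediately
        # so the next constraint works with the updated f
        x[i] -= f(x)[i] / J[i, i]

print(x)
```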
Baumgarte ramping can be layered on this nonlinear Gauss-Seidel (NGS) loop like this:
compute J
beta = 0.1
while norm(f(x)) > tol
    perform NGS iterations on J*d = beta * f
    beta = min(beta + 0.1, 1.0)
end
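The same sketch with Baumgarte-style ramping added (same illustrative system and simplifying per-coordinate constraints as before): each sweep removes only a fraction beta of each constraint's current error, and beta ramps toward 1, which damps overshoot in the early sweeps.

```python
import numpy as np

def f(x):
    return np.array([3.0 * x[0] + np.sin(x[1]) - 2.0,
                     np.cos(x[0]) + 2.0 * x[1] - 1.0])

J_diag = np.array([3.0, 2.0])  # diagonal of this system's Jacobian (constant here)
x = np.zeros(2)
beta = 0.1
tol = 1e-10
while np.linalg.norm(f(x)) > tol:
    for i in range(2):
        # Remove only a fraction beta of constraint i's current error
        x[i] -= beta * f(x)[i] / J_diag[i]
    beta = min(beta + 0.1, 1.0)

print(x)
```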
With this algorithm I can stack 8 square boxes at (30Hz, 5 iterations) and around 18 at (60Hz, 10 iterations). I also get accurate reaction forces and friction because the velocity constraints are solved cleanly. Most of the time only 1 position correction iteration is needed.