
Why is the constraint error scaled by the FPS?

Posted: Sun Nov 03, 2013 11:14 am
by structinf
Literally every physics engine I have seen scales the constraint error by the FPS (1.0 / dt).

Code: Select all

btScalar k = info->fps * info->erp;   // Bullet: scale ERP by the inverse time step
...
info->m_constraintError[i * skip] = k * (pivotBInW[i] - pivotAInW[i]);

Code: Select all

dReal k = info->fps * erp;            // ODE: the same fps * erp scaling
info->c[0] = k*j->contact.geom.depth;

However, from what I understand, the constraint error should be directly proportional to the time step, so instead of k = fps * erp it should be k = dt * erp.
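
To make the difference concrete, here is a minimal sketch (my own illustration with hypothetical values, not code from either engine) of what the two scalings produce for dt = 1/60, erp = 0.2, and 1 cm of positional error:

Code: Select all

#include <cstdio>

int main()
{
    const double dt  = 1.0 / 60.0;   // assumed 60 Hz time step
    const double fps = 1.0 / dt;
    const double erp = 0.2;          // typical-looking ERP value
    const double C   = 0.01;         // 1 cm of positional (constraint) error

    printf("fps * erp * C = %.5f\n", fps * erp * C);   // prints 0.12000
    printf("dt  * erp * C = %.7f\n", dt  * erp * C);   // prints 0.0000333
    return 0;
}
The two factors differ by fps^2 (3600x at 60 Hz), so it clearly matters which one is used.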

What am I missing here?

Re: Why is the constraint error scaled by the FPS?

Posted: Tue Nov 05, 2013 6:31 am
by RandyGaul
The bias term is a function of position and time. It is my understanding that the constraint error will decay over time if and only if your Baumgarte term satisfies 0 < baumgarte < 2 / dt. Within that range the error may oscillate as it converges; it is over-damped (converges without oscillation) only when baumgarte is less than 1 / dt.
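
As a toy sketch of that bound (my own illustration, not engine code): assume each step fully integrates a bias velocity of baumgarte * C over dt, so the remaining error becomes C_next = (1 - baumgarte * dt) * C. A few hypothetical baumgarte values show the three regimes:

Code: Select all

#include <cstdio>

int main()
{
    const double dt = 1.0 / 60.0;                     // hypothetical 60 Hz step
    const double betas[] = { 0.5 / dt, 1.5 / dt, 2.5 / dt };

    for (double beta : betas)
    {
        double C = 1.0;                               // start with 1 unit of positional error
        printf("baumgarte = %.1f/dt:", beta * dt);
        for (int step = 0; step < 6; ++step)
        {
            C *= (1.0 - beta * dt);                   // one step of the toy decay model
            printf(" %+.3f", C);
        }
        printf("\n");
    }
    return 0;
}
Below 1 / dt the error shrinks monotonically, between 1 / dt and 2 / dt it decays while flipping sign each step, and above 2 / dt it blows up.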

You don't want your solution to converge with an oscillation; you want it to converge smoothly, heading straight towards the solution. As to why the bias isn't critically damped, I imagine it's because of the nature of an iterative solver: each constraint is solved in isolation and can affect the results of previously solved constraints.

As such, I suppose it would be "safer" to over-damp your function in order to avoid the chance of oscillation. This is why you scale by the timestepping frequency and then go a little lower with the Baumgarte term. This makes sense and explains, at least to me, why Baumgarte stabilization converges with a horrid-looking oscillation once the error gets too large.

I say all of this with the disclaimer that I'm not totally sure if this is correct info, just my current understanding.

Re: Why is the constraint error scaled by the FPS?

Posted: Wed Nov 06, 2013 7:08 pm
by structinf
That actually made sense. Thank you!