Why does `timestep` default to 0.002? Comfortable ranges?

Discussion in 'Simulation' started by tor, Feb 22, 2018.

  1. tor

    Hi Emo:

    Why does `timestep` default to 0.002?
    Is it because 0.002 lies within the "comfortable range" where the time step is "just right" for a typical set of test models?
    And how do you usually find such a comfortable range for `timestep`?
    (I copy-pasted some of this wording from http://www.mujoco.org/book/computation.html.)

    I do appreciate your time.
    Thank you,
    tor
     
    Last edited: Feb 22, 2018
  2. Emo Todorov

    It has to default to something, and 0.002 is a reasonable choice for an "average" model. For any given model, though, you may be able to get away with a much larger time step (it is rare that you would need to make it smaller for stability reasons).

    It also depends on what you are doing. If you are just simulating and want the best quality possible, make the time step as small as possible while still running in real time. If you are sampling for the purpose of optimization, make it as large as possible without going unstable. The default of 0.002 is closer to the former scenario: the simulation is usually stable with much larger time steps (say 0.01), but if you don't need to run faster than real-time, you might as well get all the accuracy you can.
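
    To make the "as large as possible without going unstable" advice concrete, here is a minimal sketch of that trial-and-error search. It assumes the official `mujoco` Python bindings (which postdate this thread) and a toy falling-capsule model chosen purely for illustration; the stability check (the state staying finite) and the candidate timesteps are assumptions, not a definitive test.

```python
import mujoco
import numpy as np

# Illustrative toy model: a capsule free-falling onto a plane.
# The timestep is set in the MJCF <option> element and can also be
# changed at runtime via model.opt.timestep.
MODEL_XML = """
<mujoco>
  <option timestep="0.002"/>
  <worldbody>
    <geom type="plane" size="1 1 0.1"/>
    <body pos="0 0 0.5">
      <freejoint/>
      <geom type="capsule" size="0.05 0.2"/>
    </body>
  </worldbody>
</mujoco>
"""

def is_stable(timestep, duration=2.0):
    """Simulate for `duration` seconds at the given timestep and
    report whether the state stayed finite (a crude stability proxy)."""
    model = mujoco.MjModel.from_xml_string(MODEL_XML)
    model.opt.timestep = timestep
    data = mujoco.MjData(model)
    while data.time < duration:
        mujoco.mj_step(model, data)
        if not (np.all(np.isfinite(data.qpos)) and np.all(np.isfinite(data.qvel))):
            return False
    return True

# Scan candidate timesteps from small to large; the largest stable one
# is a reasonable starting point for sampling-style workloads.
for dt in (0.002, 0.005, 0.01, 0.02, 0.05):
    print(f"timestep={dt}: {'stable' if is_stable(dt) else 'unstable'}")
```

    Where exactly instability sets in depends on the model (masses, stiffness, contacts, solver settings), so this kind of scan should be rerun per model rather than reused across models.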