Thank you for posting this. Here's some guidance for migrating your domain randomization parameters from Isaac Gym (the walk-these-ways-go2 setup) to Isaac Lab.

**1. Randomizing Motor Strength (Torque Scaling) for ActuatorNetMLP**

Isaac Lab does not offer built-in per-environment randomization for motor strength within the actuator model configuration itself (including `ActuatorNetMLP`).

Reference: see how actuator gains and limits are handled in `isaaclab.actuators.ActuatorBase`, which most actuator models inherit. The configuration does not natively support random per-env scaling, so you must implement runtime scaling externally.
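One workaround is to scale the torques outside the configuration, e.g. by subclassing `ActuatorNetMLP` and multiplying its output efforts by a per-env factor. Here is a minimal sketch; the `strength_scale` buffer, the `resample_strength` helper, and the [0.8, 1.2] range are illustrative assumptions (not Isaac Lab API), and attribute names like `_num_envs`/`_device`/`num_joints` follow `ActuatorBase` internals as I understand them. You would register the subclass through your actuator cfg's `class_type`:

```python
import torch

from isaaclab.actuators import ActuatorNetMLP


class RandomizedStrengthActuatorNetMLP(ActuatorNetMLP):
    """Sketch: per-env motor strength scaling on top of ActuatorNetMLP."""

    def __init__(self, cfg, *args, **kwargs):
        super().__init__(cfg, *args, **kwargs)
        # One scale factor per environment and joint; 1.0 = nominal strength.
        self.strength_scale = torch.ones(self._num_envs, self.num_joints, device=self._device)

    def resample_strength(self, env_ids, low=0.8, high=1.2):
        # Hypothetical helper: call this from your episode-reset logic.
        rand = torch.rand(len(env_ids), self.num_joints, device=self._device)
        self.strength_scale[env_ids] = rand * (high - low) + low

    def compute(self, control_action, joint_pos, joint_vel):
        # Let the actuator network compute torques, then scale them per env.
        control_action = super().compute(control_action, joint_pos, joint_vel)
        control_action.joint_efforts = control_action.joint_efforts * self.strength_scale
        return control_action
```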
**2. Motor Offset Randomization Support**

Motor offset randomization (i.e., adding a bias to the position targets of the actuators) is not directly a feature of the actuator models, but you can insert it through the action processing pipeline. Example:

```python
import torch

# action: (num_envs, action_dim) position targets.
random_offsets = torch.rand_like(action) * 0.04 - 0.02  # uniform in [-0.02, 0.02]
action_with_offset = action + random_offsets
```
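Note that the snippet above resamples offsets on every call. If you want the offsets to persist across steps and be resampled only when an environment resets (the usual behavior for this kind of randomization), you can keep them in a small helper. This is a framework-agnostic sketch; the class name and offset range are mine, not Isaac Lab API:

```python
import torch


class MotorOffsetRandomizer:
    """Sketch: persistent per-env joint offsets, resampled at episode reset."""

    def __init__(self, num_envs, num_joints, device, offset_range=(-0.02, 0.02)):
        self.low, self.high = offset_range
        self.offsets = torch.zeros(num_envs, num_joints, device=device)

    def reset(self, env_ids):
        # Resample offsets only for the environments being reset.
        rand = torch.rand(len(env_ids), self.offsets.shape[1], device=self.offsets.device)
        self.offsets[env_ids] = rand * (self.high - self.low) + self.low

    def apply(self, actions):
        # Add the bias to position targets before they reach the actuators.
        return actions + self.offsets
```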
**3. Implementing Action Lag / Random Lag Timesteps per Environment**

Isaac Lab provides a buffer utility for this: use `DelayBuffer` from `isaaclab.utils.buffers`.
Example API use (from the docs):

```python
import torch
from isaaclab.utils.buffers import DelayBuffer

buffer = DelayBuffer(history_length=MAX_LAG, batch_size=NUM_ENVS, device=device)

# At episode reset, assign each environment a random lag in [0, MAX_LAG]:
lag_timesteps_tensor = torch.randint(0, MAX_LAG + 1, (NUM_ENVS,), device=device)
buffer.set_time_lag(lag_timesteps_tensor)  # accepts an int or a per-env tensor

# When stepping, push the new actions and receive the delayed ones:
delayed_actions = buffer.compute(new_actions)
```
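As far as I can tell from the docs, `DelayBuffer` also exposes `reset(batch_ids=...)` to clear the stored history for specific environments; calling it for the envs being reset before assigning their new lags avoids replaying stale actions from the previous episode.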
**Summary Table**

| Randomization | Native in Isaac Lab? | Approach |
| --- | --- | --- |
| Motor strength (torque scaling) | No | Scale torques externally (e.g., around `ActuatorNetMLP`) |
| Motor offsets | No | Add a per-env bias in the action processing pipeline |
| Action lag (random per-env timesteps) | Yes | `isaaclab.utils.buffers.DelayBuffer` |

**Key Isaac Lab API References**

- `isaaclab.actuators.ActuatorBase` (base class for actuator models, including `ActuatorNetMLP`)
- `isaaclab.utils.buffers.DelayBuffer`
Hi everyone,
I'm migrating from Isaac Gym (specifically the walk-these-ways-go2 repo) to Isaac Lab and looking to replicate the following domain randomization parameters: motor strength (torque scaling), motor offsets, and random per-environment action lag.

I'm using `ActuatorNetMLP` and would like to ask whether these are supported natively. Would appreciate any guidance on native support or best practices for implementing these in Isaac Lab.
Thanks!