Replies: 2 comments
Hi,

```python
self.load_policy(
    assets_root_path + "/Isaac/Samples/Policies/Spot_Policies/spot_policy.pt",
    assets_root_path + "/Isaac/Samples/Policies/Spot_Policies/spot_env.yaml",
)
```

Let me know if you succeed in applying this approach. Cheers
Hi, I made the policy work in Isaac Sim using the Spot example as mentioned before.

```bash
./isaaclab.sh -p scripts/reinforcement_learning/rsl_rl/train.py --task Isaac-Velocity-Flat-Unitree-Go2-v0 --headless --max_iterations 2000
./isaaclab.sh -p scripts/reinforcement_learning/rsl_rl/play.py --task Isaac-Velocity-Flat-Unitree-Go2-v0 --num_envs 1
```
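As a side note: in the Isaac Lab rsl_rl workflow, running the play script typically exports a TorchScript copy of the trained policy under the run's log directory. The exact layout below is an assumption based on the default config (adjust the paths to your own setup):

```shell
# Hedged sketch: look for the exported policy under the default rsl_rl log layout.
# The "logs/rsl_rl/*/exported/" pattern is an assumption; your run directory may differ.
out=$(ls logs/rsl_rl/*/*/exported/policy.pt 2>/dev/null || echo "no exported policy found")
echo "$out"
```

Whatever `policy.pt` this finds is the file you would point `load_policy` at in the snippet below.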
```python
self.load_policy(
    "[YOUR_PATH_TO]/policy.pt",
    "[YOUR_PATH_TO]/env.yaml",
)
self._action_scale = 0.25
self.world = World(stage_units_in_meters=1.0, physics_dt=1 / 200, rendering_dt=1 / 50)
self._input_keyboard_mapping = {
    # forward command
    "NUMPAD_8": [1.0, 0.0, 0.0],
    "UP": [1.0, 0.0, 0.0],
    # back command
    "NUMPAD_2": [-1.0, 0.0, 0.0],
    "DOWN": [-1.0, 0.0, 0.0],
    # left command
    "NUMPAD_6": [0.0, -1.0, 0.0],
    "RIGHT": [0.0, -1.0, 0.0],
    # right command
    "NUMPAD_4": [0.0, 1.0, 0.0],
    "LEFT": [0.0, 1.0, 0.0],
    # yaw command (positive)
    "NUMPAD_7": [0.0, 0.0, 1.0],
    "N": [0.0, 0.0, 1.0],
    # yaw command (negative)
    "NUMPAD_9": [0.0, 0.0, -1.0],
    "M": [0.0, 0.0, -1.0],
}
```

Hope this helps. Find the training results attached: go2_policy.zip. Cheers,
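For anyone adapting this: the mapping pairs each key with an `[x, y, yaw]` velocity increment, and a keyboard handler can simply sum the vectors of all currently held keys to form the base command. A minimal pure-Python sketch of that idea (the `base_command` helper and the reduced mapping are illustrative, not the extension's actual code):

```python
# Illustrative sketch: accumulate a base velocity command from held keys.
# The mapping mirrors a subset of the one in the post; the helper is hypothetical.
INPUT_KEYBOARD_MAPPING = {
    "UP": [1.0, 0.0, 0.0],      # forward
    "DOWN": [-1.0, 0.0, 0.0],   # back
    "RIGHT": [0.0, -1.0, 0.0],  # strafe (negative y)
    "LEFT": [0.0, 1.0, 0.0],    # strafe (positive y)
    "N": [0.0, 0.0, 1.0],       # yaw (positive)
    "M": [0.0, 0.0, -1.0],      # yaw (negative)
}

def base_command(held_keys):
    """Sum the [x, y, yaw] vectors of all currently held keys."""
    cmd = [0.0, 0.0, 0.0]
    for key in held_keys:
        delta = INPUT_KEYBOARD_MAPPING.get(key)
        if delta is not None:
            cmd = [c + d for c, d in zip(cmd, delta)]
    return cmd

print(base_command(["UP", "N"]))  # forward while yawing -> [1.0, 0.0, 1.0]
```

Opposing keys cancel naturally (e.g. `UP` plus `DOWN` yields a zero command), which is why a sum rather than a last-key-wins rule is a common choice here.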
Does anyone have a well-trained policy for the Unitree Go2?
I am hoping to use the Unitree Go2 in an extension similar to the Quadruped extension in the Robotics examples, which targets the Spot robot from Boston Dynamics. That robot comes with a pre-trained policy, so you can start working with it straight away. For the Unitree Go2, however, I have to run through the reinforcement learning training tutorials to get a policy, and regardless of how long I train or how I fine-tune the parameters, I have never been completely happy with the results.
Since locomotion training isn't even my area of work, I just seem to be wasting hours trying to get a starting point for a well-known and widely used robot.
Many thanks in advance