Let's run HuggingFace [`LeRobot`](https://github.com/huggingface/lerobot/) to train Transformer-based [action diffusion](https://diffusion-policy.cs.columbia.edu/) policies and [ACT](https://github.com/tonyzhaozh/act) onboard NVIDIA Jetson. These models learn to predict actions for a particular task from visual inputs and prior trajectories, typically collected during teleoperation or in simulation.
## Work with Real-World Robots - Before starting containers
This section walks you through the LeRobot official example [Getting Started with Real-World Robots \(`7_get_started_with_real_robot.md`\)](https://github.com/huggingface/lerobot/blob/main/examples/7_get_started_with_real_robot.md) on your Jetson.
!!! tip
    It's recommended to work on your Jetson in **monitor-attached** mode.
    `lerobot` is designed to show camera views in windows and play back TTS audio while capturing a dataset, so it is more convenient to set up your Jetson with its monitor (and speakers) attached.
### a. Check `jetson-containers`' location
Throughout the `lerobot` workflows, we will be generating a lot of data, especially when capturing datasets.
We will clone the `lerobot` directory on the host and mount it in the container to keep all the data persistent. But first, make sure your `jetson-containers` directory is placed on your SSD, not on your eMMC or microSD card.
If you created the `jetson-containers` directory on eMMC or microSD card (likely the case if you first set up your Jetson without an SSD and added one later), use the `rsync` command to move the entire directory under the SSD mount point.
As described above, we will set up the `lerobot` directory under the `data` directory of `jetson-containers` and mount it inside the container so that the generated data persists.
LeRobot's dataset capture flow (`control_robot.py`) utilizes **Speech Dispatcher** with espeak TTS to give operators audio cues, notifying them of the status and signaling the next operation. It's actually very helpful.
Speech Dispatcher utilizes PulseAudio, so rather than just sharing the `/dev/snd` device on `docker run` (which suffices for ALSA), we need to add some extra arguments.
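For reference, PulseAudio passthrough is typically done with arguments along these lines (a sketch, not the authoritative list from `run.sh`; the `XDG_RUNTIME_DIR` socket path assumes a logged-in desktop session, so check `run.sh` itself for the exact flags):

```
docker run ... \
  --device /dev/snd \
  -e PULSE_SERVER=unix:${XDG_RUNTIME_DIR}/pulse/native \
  -v ${XDG_RUNTIME_DIR}/pulse:${XDG_RUNTIME_DIR}/pulse \
  ...
```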
This is already added to `run.sh` of `jetson-containers`; however, we need to edit `/etc/pulse/default.pa` in order to allow the root user access to the socket file.
```bash
sudo vi /etc/pulse/default.pa
```
Find the section loading `module-native-protocol-unix` and add `auth-anonymous=1`.
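After the edit, the line should look something like this (any other options already on that line in your `default.pa` should be kept as they are):

```
load-module module-native-protocol-unix auth-anonymous=1
```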
Then restart the PulseAudio service to make the config take effect.
```bash
pulseaudio --kill
pulseaudio --start
```
> For troubleshooting or details, please check the [`docs.md`](https://github.com/dusty-nv/jetson-containers/blob/dev/packages/speech/speech-dispatcher/docs.md) of the `speech-dispatcher` package.
### d. Set udev rule for ACM devices
It is more convenient if the lerobot programs can always find the leader and follower arm devices under unique names.
For that, we set a udev rule so that the arms are always assigned the same device names, as follows.<br>
This is done on the Jetson host side first.
- `/dev/ttyACM_kochleader` : Leader arm
- `/dev/ttyACM_kochfollower` : Follower arm
First, connect only the leader arm to the Jetson and record its serial ID.
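One common way to read the serial is with `udevadm` (a sketch, assuming the leader arm enumerated as `/dev/ttyACM0`; this is not necessarily the exact command the official tutorial uses):

```bash
udevadm info --name=/dev/ttyACM0 --query=property | grep ID_SERIAL
```

A hypothetical rule using that serial, placed under `/etc/udev/rules.d/`, could then create the stable symlink (the match key and serial value shown are illustrative):

```
SUBSYSTEM=="tty", ATTRS{serial}=="XXXXXXXX", SYMLINK+="ttyACM_kochleader"
```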
If you plan to use CSI cameras (not USB webcams) for data capture, use the new `--csi2webcam` option of `jetson-containers`, which exposes V4L2 loopback devices that behave like USB webcams (MJPEG) for CSI cameras, using Jetson's hardware JPEG encoder.
This feature requires some packages to be installed.
## Work with Real-World Robots - Once in container
!!! tip "JupyterLab tip"
    Inside the `lerobot` container, a JupyterLab server process starts.
    You can access it at `http://localhost:8888/` (or `http://<IP_ADDRESS>:8888/` from another PC on the same network).
    In the `notebooks` directory, there are Jupyter notebooks for each segment of the official tutorial [Getting Started with Real-World Robots \(`7_get_started_with_real_robot.md`\)](https://github.com/huggingface/lerobot/blob/main/examples/7_get_started_with_real_robot.md).
    Please note that some of them (like `notebooks/7-2_real-robot_configure-motors.ipynb`) can be used as actual working notebooks, to execute Python code and scripts conveniently inside the notebook along with the instructions (rather than switching to the console).
    However, keep in mind that you are encouraged to always check the [original official tutorial](https://github.com/huggingface/lerobot/blob/main/examples/7_get_started_with_real_robot.md), and some operations, like training, are much better executed on the console.
!!! tip "Bash history tip"
    Inside the container, on the console, you can press the ++up++ key to scroll through some of the frequently used commands pre-registered in the bash history.
### q. Setup audio
Check if PulseAudio is available.
```bash
pactl info
```
If you need to set the default audio output device, use `set-default-sink`.
```bash
pactl list short sinks
pactl set-default-sink [SINK_NAME_OR_INDEX]
```
### 1. Order and Assemble your Koch v1.1
You can order the Koch v1.1 kits from ROBOTIS. (*Note: they don't come with the 3D-printed parts.*)
### 2. Configure motors, calibrate arms, teleoperate your Koch v1.1
Follow the Jupyter notebook `7-2_real-robot_configure-motors.ipynb`.
### 3. Record your Dataset and Visualize it
You should mostly operate in the container's terminal.
Follow the [official document's section](https://github.com/huggingface/lerobot/blob/main/examples/7_get_started_with_real_robot.md#3-record-your-dataset-and-visualize-it).
!!! tip "Camera config tip"
    The official document demonstrates two camera positions, one at the top ("phone") and the other directly in front, facing the arm ("laptop").
    In our trials, this camera placement worked, but we needed to zoom the cameras in on the scene so that they capture better spatial resolution.
    Another thing worth experimenting with is the **wrist cam**. More to come later.
!!! tip
    If you plan to perform training on a different machine, `scp` the dataset directory.
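    For example (the dataset directory name and the remote host here are hypothetical placeholders; substitute your own):

    ```bash
    # <dataset_dir> and training-host are placeholders
    scp -r data/<dataset_dir> user@training-host:~/lerobot/data/
    ```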
Follow the [official document's section](https://github.com/huggingface/lerobot/blob/main/examples/7_get_started_with_real_robot.md#4-train-a-policy-on-your-data).
!!! tip
    The following commands are registered in the Bash history inside the `lerobot` container.
If you performed the training on another Jetson or PC, `scp` the contents of the outputs directory back to the original Jetson that has the leader and follower arms attached.
Follow the [official document's section](https://github.com/huggingface/lerobot/blob/main/examples/7_get_started_with_real_robot.md#3-record-your-dataset-and-visualize-it).
!!! tip "Tip for **a. Use `koch.yaml` and our `record` function**"
    Modify the command in the bash history to add the `-p` argument to point to the policy checkpoint.
Outside of the container, first launch the [rerun.io](https://rerun.io/) visualization tool that LeRobot uses <sup>[[↗]](https://github.com/huggingface/lerobot/?tab=readme-ov-file#visualize-datasets)</sup>
This will download and run a pre-trained [diffusion model](https://huggingface.co/lerobot/diffusion_pusht) on the [PushT](https://github.com/huggingface/gym-pusht) environment <sup>[[↗]](https://github.com/huggingface/lerobot/?tab=readme-ov-file#evaluate-a-pretrained-policy)</sup>
Next, train [ACT](https://github.com/tonyzhaozh/act) on the [Aloha](https://github.com/huggingface/gym-aloha) manipulation environment <sup>[[↗]](https://github.com/huggingface/lerobot/?tab=readme-ov-file#train-your-own-policy)</sup>