
Commit 0d7fa37

Improve content at 4 anchor points for better app-link UX
- set-up-a-machine/overview.md #sbc-setup-instructions: replace generic tip box with direct links to supported SBC setup guides
- data/capture-sync/capture-and-sync-data.md #supported-resources: add table of component types and capturable methods (verified against rdk/components/*/collectors.go)
- train/train-a-model.md #upload-your-training-script: add upload steps with CLI command instead of bare pointer link
- train/train-a-model.md #click-for-more-information-on-parsing-command-line-inputs: add the actual flags (--dataset_file, --model_output_directory, custom --key=value args) verified against rdk/cli/ml_training.go

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
1 parent 5efb39d commit 0d7fa37

3 files changed: 45 additions & 7 deletions


docs/data/capture-sync/capture-and-sync-data.md

Lines changed: 18 additions & 1 deletion
@@ -90,7 +90,24 @@ If you see data flowing in, capture and sync are working correctly.
 
 ## Supported resources {#supported-resources}
 
-Any component that implements a Viam API method returning data can be configured for data capture. This includes cameras, sensors, movement sensors, encoders, and any modular component that returns readings. See the [data reference](/data/reference/) for the full list of supported methods and data types.
+Any built-in or modular component that implements a capturable method can be configured for data capture. The following table lists the most commonly captured component types and their methods:
+
+| Component type | Capturable methods |
+| -------------- | ------------------ |
+| Camera | `ReadImage`, `GetImages`, `NextPointCloud` |
+| Sensor | `GetReadings` |
+| Movement sensor | `GetPosition`, `GetLinearVelocity`, `GetAngularVelocity`, `GetCompassHeading`, `GetLinearAcceleration`, `GetOrientation`, `GetReadings` |
+| Encoder | `GetTicksCount` |
+| Motor | `GetPosition`, `IsPowered` |
+| Power sensor | `GetVoltage`, `GetCurrent`, `GetPower`, `GetReadings` |
+| Servo | `GetPosition` |
+| Arm | `GetEndPosition`, `GetJointPositions` |
+| Gantry | `GetPosition`, `GetLengths` |
+| Board | `Analogs`, `Gpios` |
+| Vision service | `CaptureAllFromCamera` |
+| SLAM service | `GetPosition`, `GetPointCloudMap` |
+
+Every component and service also supports capturing `DoCommand` responses. Modular components that return readings through any of these methods are automatically capturable.
 
 ## Troubleshooting
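As a usage sketch (not part of this commit): capture for one of the methods in the table above is enabled per component in the machine's JSON config. The exact attribute names below are recalled from memory and may be out of date; the app's data capture panel generates this block for you, so treat this only as an illustration of the shape.

```json
{
  "name": "my-sensor",
  "type": "sensor",
  "service_configs": [
    {
      "type": "data_manager",
      "attributes": {
        "capture_methods": [
          {
            "method": "Readings",
            "capture_frequency_hz": 0.5,
            "additional_params": {}
          }
        ]
      }
    }
  ]
}
```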

docs/set-up-a-machine/overview.md

Lines changed: 8 additions & 4 deletions
@@ -56,12 +56,16 @@ Use the **Platform you want to run on** dropdown to select the operating system
 
 Options include Linux / Aarch64, Linux / x86, Mac, Windows native, Windows (WSL), Linux / Armv7l, and ESP32.
 
-{{< alert title="Tip" color="tip" >}}
-If you're using a single-board computer like a Raspberry Pi or NVIDIA Jetson, make sure it's running a compatible Linux OS before continuing.
-The setup page links to an [installation guide](/reference/device-setup/) for supported single-board computers.
-{{< /alert >}}
+If you're using a single-board computer, follow the setup guide for your board before continuing:
+
+- [Raspberry Pi](/reference/device-setup/rpi-setup/)
+- [NVIDIA Jetson Nano](/reference/device-setup/jetson-nano-setup/)
+- [NVIDIA Jetson AGX Orin](/reference/device-setup/jetson-agx-orin-setup/)
+- [BeagleBone AI-64](/reference/device-setup/beaglebone-setup/)
+- [Orange Pi Zero 2](/reference/device-setup/orange-pi-zero2/)
+- [Orange Pi 3 LTS](/reference/device-setup/orange-pi-3-lts/)
+
+See [all supported boards](/reference/device-setup/) for the full list.
 
 ## 4. Select your installation method

docs/train/train-a-model.md

Lines changed: 19 additions & 2 deletions
@@ -266,11 +266,28 @@ for _, job := range jobs {
 
 ## Upload your training script {#upload-your-training-script}
 
-To write and upload a custom training script instead of using a built-in training type, see [Custom training scripts](/train/custom-training-scripts/#package-and-upload).
+To use your own training script instead of a built-in training type:
+
+1. Package your script with a `setup.py` or `pyproject.toml`. Viam invokes it as `python3 -m model.training`.
+2. Upload it with the CLI:
+
+   ```sh {class="command-line" data-prompt="$"}
+   viam training-script upload --path=<path-to-script> --org-id=<org-id> --script-name=<name>
+   ```
+
+3. Submit a training job that references your script (from the web UI or CLI).
+
+For the full walkthrough, including local testing with Docker, see [Custom training scripts](/train/custom-training-scripts/).
 
 ## Parse command line inputs {#click-for-more-information-on-parsing-command-line-inputs}
 
-For details on parsing command line arguments in custom training scripts, see [Write a training script](/train/custom-training-scripts/#trainingpy).
+When you submit a training job with custom arguments, Viam passes them to your training script as command-line flags. Your script receives:
+
+- `--dataset_file`: the path to the dataset manifest (default: `dataset.jsonl`)
+- `--model_output_directory`: where to write the trained model (fixed: `/model_output`)
+- any custom arguments you defined, passed as `--key=value` pairs
+
+Argument keys must be alphanumeric, with underscores or hyphens. For full details on writing and testing training scripts, see [Custom training scripts](/train/custom-training-scripts/).
 
 ## What's next
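As a usage sketch (not part of this commit): a training script could handle the flags described above with `argparse`, declaring the two known flags and collecting any `--key=value` custom arguments via `parse_known_args`. The function name and the `--num_epochs` example are illustrative, not part of any Viam API.

```python
import argparse

def parse_training_args(argv):
    """Parse the flags a Viam training job passes to a custom script."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--dataset_file", required=True)
    parser.add_argument("--model_output_directory", required=True)
    # Known flags are parsed; anything else (custom --key=value args)
    # lands in `unknown` and is collected into a dict.
    known, unknown = parser.parse_known_args(argv)
    custom = {}
    for item in unknown:
        if item.startswith("--") and "=" in item:
            key, _, value = item[2:].partition("=")
            custom[key] = value
    return known, custom
```

`parse_known_args` is used instead of `parse_args` so the script doesn't have to declare every custom argument up front.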
