docs/source/user_guide/model_training/distributed_training/ray

@@ -179,8 +179,6 @@ Do a dry run to inspect how the yaml translates to Job and Job Runs
     ads opctl run -f train.yaml --dry-run


-**Use ads opctl to create the cluster infrastructure and run the workload:**
-
 .. include:: ../_test_and_submit.rst

 **Monitoring the workload logs**
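For context, the dry run above only renders the Job and Job Run specs without creating anything. A sketch of the surrounding workflow, assuming the ``ads opctl watch`` subcommand is available and using a hypothetical job-run OCID placeholder:

```shell
# Render the Job / Job Run translation of the yaml without creating resources
ads opctl run -f train.yaml --dry-run

# Create the cluster infrastructure and submit the workload
ads opctl run -f train.yaml

# Tail the logs of a running job run (replace the placeholder with a real OCID)
ads opctl watch <job-run-ocid>
```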
@@ -7,13 +7,9 @@ Ray is a framework for distributed computing in Python specialized in ML workloads
 The documentation shows how to create a container and ``yaml`` spec to run a ``Ray``
 code sample in distributed modality.

-
-.. admonition:: Ray
-   :class: note
-
-   ``Ray`` offers a core package to simply execute Python workloads in a distributed manner,
-   potentially across a cluster of machines (set up through ``Ray`` itself), but also other
-   extensions to perform more traditional ML computation, such as Hyperparameter Optimization.
+``Ray`` offers a core package to simply execute Python workloads in a distributed manner,
+potentially across a cluster of machines (set up through ``Ray`` itself), but also other
+extensions to perform more traditional ML computation, such as Hyperparameter Optimization.


 .. toctree::