|
25 | 25 | "\n", |
26 | 26 | "We want to hear your feedback and any issues you encounter!\n", |
27 | 27 | "\n", |
28 | | - "* [GitHub issues](https://github.com/anyscale/academy/issues)\n", |
29 | | - "* The [#tutorial channel](https://ray-distributed.slack.com/archives/C011ML23W5B) on the [Ray Slack](https://ray-distributed.slack.com)\n", |
| 28 | + "* The [#tutorial channel](https://ray-distributed.slack.com/archives/C011ML23W5B) on the [Ray Slack](https://ray-distributed.slack.com). [Click here](https://forms.gle/9TSdDYUgxYs8SA9e8) to join the Ray Slack.\n", |
30 | 29 | "* [Email](mailto:[email protected])\n", |
31 | 30 | "\n", |
| 31 | + "Find an issue? Please report it!\n", |
| 32 | + "\n", |
| 33 | + "* [GitHub issues](https://github.com/anyscale/academy/issues)\n", |
| 34 | + "\n", |
| 35 | + "### Troubleshooting\n", |
| 36 | + "\n", |
32 | 37 | "Troubleshooting tips are offered in known areas where you might encounter issues. All are summarized in the [_Troubleshooting, Tips, and Tricks notebook_](reference/Troubleshooting-Tips-Tricks.ipynb). For the details of the Ray API and libraries built on Ray, see the [Ray Docs](https://docs.ray.io/en/latest/).\n", |
33 | 38 | "\n", |
34 | 39 | "If you are new to Jupyter Lab and Jupyter Notebooks, see _Help > JupyterLab Reference_. The _Help_ menu also has references for Python and various libraries." |
|
53 | 58 | "* Accelerated model training with PyTorch: see [_Ray SGD_](#user-content-ray-sgd)\n", |
54 | 59 | "* Model serving: see [_Ray Serve_](#user-content-ray-serve)\n", |
55 | 60 | "\n", |
| 61 | + "If you are a _DevOps_ engineer interested in managing Ray clusters, see [_Ray Cluster Launcher_](#user-content-ray-cluster-launcher).\n", |
| 62 | + "\n", |
56 | 63 | "> **Note:** Older Ray tutorials can be found in [this repo](https://github.com/ray-project/tutorial). They cover topics not yet available here." |
57 | 64 | ] |
58 | 65 | }, |
|
113 | 120 | "\n", |
114 | 121 | "The _crash course_ is intended to focus on learning the core API as quickly as possible, but using nontrivial examples. In contrast, the _Advanced Ray_ tutorial explores more advanced API usage, profiling and debugging applications, and how Ray works behind the scenes.\n", |
115 | 122 | "\n", |
116 | | - "| # | Lesson (Notebook) | Description |\n", |
| 123 | + "| | Lesson (Notebook) | Description |\n", |
117 | 124 | "| :- | :------------------------------------------------------------------------- | :---------------------------------------- |\n", |
118 | 125 | "| 00 | [Overview](ray-crash-course/00-Overview-Ray-Crash-Course.ipynb) | A _table of contents_ for this tutorial. |\n", |
119 | 126 | "| 01 | [Ray Tasks](ray-crash-course/01-Ray-Tasks.ipynb) | Understanding how Ray converts normal Python functions into distributed _stateless tasks_. |\n", |
|
139 | 146 | "\n", |
140 | 147 | "Go through the [_Crash Course_](#ray-crash-course) first if you are new to Ray. This tutorial provides a deeper dive into Ray tasks and actors, such as profiling and debugging applications. It also surveys the rest of the core API.\n", |
141 | 148 | "\n", |
142 | | - "| # | Lesson (Notebook) | Description |\n", |
| 149 | + "| | Lesson (Notebook) | Description |\n", |
143 | 150 | "| :- | :-------------------------------------------------------- | :---------------------------------------- |\n", |
144 | 151 | "| 00 | [Overview](advanced-ray/00-Overview-Advanced-Ray.ipynb) | A _table of contents_ for this tutorial. |\n", |
145 | 152 | "| 01 | [Ray Tasks Revisited](advanced-ray/01-Ray-Tasks-Revisited.ipynb) | More exploration of `ray.wait()` usage patterns, task dependencies and their management, and task profiling techniques. |\n", |
|
224 | 231 | "| [Extra: Application - Taxi](ray-rllib/explore-rllib/extras/Extra-Application-Taxi.ipynb) | Based on the `Taxi-v3` environment from OpenAI Gym. |\n", |
225 | 232 | "| [Extra: Application - Frozen Lake](ray-rllib/explore-rllib/extras/Extra-Application-Frozen-Lake.ipynb) | Based on the `FrozenLake-v0` environment from OpenAI Gym. |\n", |
226 | 233 | "\n", |
227 | | - "In addition, exercise solutions for this tutorial can be found [here](ray-rllib/explore-rllib/solutions/Ray-RLlib-Solutions.ipynb).\n", |
228 | | - "\n", |
229 | | - "For earlier versions of some of these tutorials, see [`rllib_exercises`](https://github.com/ray-project/tutorial/blob/master/rllib_exercises/rllib_colab.ipynb) in the original [github.com/ray-project/tutorial](https://github.com/ray-project/tutorial) project." |
| 234 | + "In addition, exercise solutions for this tutorial can be found [here](ray-rllib/explore-rllib/solutions/Ray-RLlib-Solutions.ipynb)." |
230 | 235 | ] |
231 | 236 | }, |
232 | 237 | { |
|
239 | 244 | "\n", |
240 | 245 | "_Ray Tune_ is Ray's system for _hyperparameter tuning_. This tutorial starts with an explanation of what hyperparameter tuning is for and the performance challenges of doing it for many applications. Then the tutorial explores how to use _Tune_, how it integrates with several popular ML frameworks, and the algorithms supported in _Tune_.\n", |
241 | 246 | "\n", |
242 | | - "> **Note:** This tutorial will be released soon." |
| 247 | + "| | Lesson | Description |\n", |
| 248 | + "| :-- | :----- | :---------- |\n", |
| 249 | + "| 00 | [Ray Tune Overview](ray-tune/00-Ray-Tune-Overview.ipynb) | Overview of this tutorial. |\n", |
| 250 | + "| 01 | [Understanding Hyperparameter Tuning](ray-tune/01-Understanding-Hyperparameter-Tuning.ipynb) | An explanation of hyperparameters vs. parameters and a non-trivial example of hyperparameter tuning/optimization with Tune. |\n", |
| 251 | + "| 02 | [More Ray Tune with MNIST](ray-tune/02-More-Ray-Tune-with-MNIST.ipynb) | More exploration of the Tune API, using an MNIST example. |\n", |
| 252 | + "| 03 | [Search Algos and Schedulers](ray-tune/03-Search-Algos-and-Schedulers.ipynb) | Understanding the concepts of search algorithms and schedulers, again using an MNIST example. |\n", |
| 253 | + "| 04 | [Ray SGD](ray-tune/04-Ray-SGD.ipynb) | The new Ray SGD API and how to use it. |\n", |
| 254 | + "| | [Hyperparameter Tuning References](ray-tune/References-Hyperparameter-Tuning.ipynb) | References on hyperparameter tuning and Ray Tune. |\n", |
| 255 | + "\n", |
| 256 | + "In addition, exercise solutions for this tutorial can be found in the `ray-tune/solutions` directory." |
243 | 257 | ] |
244 | 258 | }, |
245 | 259 | { |
|
248 | 262 | "source": [ |
249 | 263 | "### Ray SGD\n", |
250 | 264 | "\n", |
251 | | - "Directory: `ray-sgd`\n", |
252 | | - "\n", |
253 | | - "_Ray SGD_ is a tool to more easily exploit a cluster to perform training with _Stochastic Gradient Descent_ using PyTorch (TensorFlow support forthcoming).\n", |
| 265 | + "_Ray SGD_ is a tool that makes it easier to use a cluster for training with _Stochastic Gradient Descent_ in PyTorch and TensorFlow.\n", |
254 | 266 | "\n", |
255 | | - "> **Note:** This tutorial will be released soon." |
| 267 | + "Currently, there is a single lesson for Ray SGD as part of the Ray Tune tutorial." |
256 | 268 | ] |
257 | 269 | }, |
258 | 270 | { |
|
263 | 275 | "\n", |
264 | 276 | "Directory: `ray-serve`\n", |
265 | 277 | "\n", |
266 | | - "_Ray Serve_ is Ray's system for scalable _model serving_, with capabilities that also make it suitable for other web server applications. This tutorial starts with an explanation of what's required in model serving, followed by a tour of the API with examples.\n", |
| 278 | + "_Ray Serve_ is Ray's system for scalable _model serving_, with capabilities that also make it suitable for other web server applications. This tutorial starts with an explanation of what's required in model serving, followed by a tour of the API with examples." |
| 279 | + ] |
| 280 | + }, |
| 281 | + { |
| 282 | + "cell_type": "markdown", |
| 283 | + "metadata": {}, |
| 284 | + "source": [ |
| 285 | + "### Ray Cluster Launcher\n", |
| 286 | + "\n", |
| 287 | + "Directory: `ray-cluster-launcher`\n", |
| 288 | + "\n", |
| 289 | + "When managing Ray clusters, you will want to use the _Ray Cluster Launcher_.\n", |
267 | 290 | "\n", |
268 | 291 | "> **Note:** This tutorial will be released soon." |
269 | 292 | ] |
|