{"cells": [{"cell_type": "markdown", "id": "b2dec494", "metadata": {"papermill": {"duration": 0.016555, "end_time": "2022-06-13T15:31:48.373430", "exception": false, "start_time": "2022-06-13T15:31:48.356875", "status": "completed"}, "tags": []}, "source": ["\n", "# Finetuning Scheduler\n", "\n", "* **Author:** [Dan Dale](https://github.com/speediedan)\n", "* **License:** CC BY-SA\n", "* **Generated:** 2022-06-13T17:31:27.948986\n", "\n", "This notebook introduces the [Finetuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension\n", "and demonstrates its use to finetune a small foundational model on the\n", "[RTE](https://huggingface.co/datasets/viewer/?dataset=super_glue&config=rte) task of\n", "[SuperGLUE](https://super.gluebenchmark.com/) with iterative early-stopping defined according to a user-specified\n", "schedule. It uses Hugging Face's ``datasets`` and ``transformers`` libraries to retrieve the relevant benchmark data\n", "and foundational model weights. The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.\n",
"\n", "\n", "---\n", "Open in [Google Colab](https://colab.research.google.com/github/PytorchLightning/lightning-tutorials/blob/publication/.notebooks/lightning_examples/finetuning-scheduler.ipynb)\n", "\n", "Give us a \u2b50 [on GitHub](https://www.github.com/PytorchLightning/pytorch-lightning/)\n", "| Check out [the documentation](https://pytorch-lightning.readthedocs.io/en/stable/)\n", "| Join us [on Slack](https://www.pytorchlightning.ai/community)"]}, {"cell_type": "markdown", "id": "2f9d0607", "metadata": {"papermill": {"duration": 0.01088, "end_time": "2022-06-13T15:31:48.395609", "exception": false, "start_time": "2022-06-13T15:31:48.384729", "status": "completed"}, "tags": []}, "source": ["## Setup\n", "This notebook requires some packages besides pytorch-lightning."]}, {"cell_type": "code", "execution_count": 1, "id": "1e79a2d8", "metadata": {"colab": {}, "colab_type": "code", "execution": {"iopub.execute_input": "2022-06-13T15:31:48.419871Z", "iopub.status.busy": "2022-06-13T15:31:48.419025Z", "iopub.status.idle": "2022-06-13T15:31:52.693199Z", "shell.execute_reply": "2022-06-13T15:31:52.692233Z"}, "id": "LfrJLKPFyhsK", "lines_to_next_cell": 0, "papermill": {"duration": 4.288856, "end_time": "2022-06-13T15:31:52.695457", "exception": false, "start_time": "2022-06-13T15:31:48.406601", "status": "completed"}, "tags": []}, "outputs": [],
"source": ["! pip install --quiet \"ipython[notebook]\" \"torch>=1.8\" \"setuptools==59.5.0\" \"pytorch-lightning>=1.4\" \"hydra-core>=1.1.0\" \"finetuning-scheduler[examples]\" \"torchmetrics>=0.7\""]}, {"cell_type": "markdown", "id": "68ff7058", "metadata": {"papermill": {"duration": 0.011277, "end_time": "2022-06-13T15:31:52.718548", "exception": false, "start_time": "2022-06-13T15:31:52.707271", "status": "completed"}, "tags": []}, "source": ["## Scheduled Finetuning with the Finetuning Scheduler Extension\n", "\n", "The [Finetuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension accelerates and enhances model experimentation with flexible finetuning schedules.\n", "\n", "Training with the extension is simple and confers a host of benefits:\n", "\n", "- it dramatically increases finetuning flexibility\n", "- it expedites and facilitates exploration of model tuning dynamics\n", "- it enables marginal performance improvements of finetuned models\n", "\n", "Setup is straightforward: just install from PyPI! Since this notebook-based example requires a few additional packages (e.g.\n", "``transformers``, ``sentencepiece``), we installed the ``finetuning-scheduler`` package with the ``[examples]`` extra above.\n", "Once the ``finetuning-scheduler`` package is installed, the [FinetuningScheduler](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts.html#finetuning_scheduler.fts.FinetuningScheduler) callback is available for use with PyTorch Lightning.\n", "For additional installation options, please see the Finetuning Scheduler [README](https://github.com/speediedan/finetuning-scheduler/blob/main/README.md).\n", "\n", "\n", "\n", "