{"cells": [{"cell_type": "markdown", "id": "43e3a29f", "metadata": {"papermill": {"duration": 0.017122, "end_time": "2023-03-15T10:45:06.228120", "exception": false, "start_time": "2023-03-15T10:45:06.210998", "status": "completed"}, "tags": []}, "source": ["\n", "# Fine-Tuning Scheduler\n", "\n", "* **Author:** [Dan Dale](https://github.com/speediedan)\n", "* **License:** CC BY-SA\n", "* **Generated:** 2023-03-15T10:44:42.950992\n", "\n", "This notebook introduces the [Fine-Tuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension\n", "and demonstrates the use of it to fine-tune a small foundation model on the\n", "[RTE](https://huggingface.co/datasets/viewer/?dataset=super_glue&config=rte) task of\n", "[SuperGLUE](https://super.gluebenchmark.com/) with iterative early-stopping defined according to a user-specified\n", "schedule. It uses Hugging Face's ``datasets`` and ``transformers`` libraries to retrieve the relevant benchmark data\n", "and foundation model weights. The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.\n", "\n", "\n", "---\n", "Open in [{height=\"20px\" width=\"117px\"}](https://colab.research.google.com/github/PytorchLightning/lightning-tutorials/blob/publication/.notebooks/lightning_examples/finetuning-scheduler.ipynb)\n", "\n", "Give us a \u2b50 [on Github](https://www.github.com/Lightning-AI/lightning/)\n", "| Check out [the documentation](https://pytorch-lightning.readthedocs.io/en/stable/)\n", "| Join us [on Slack](https://www.pytorchlightning.ai/community)"]}, {"cell_type": "markdown", "id": "7bd4e559", "metadata": {"papermill": {"duration": 0.010119, "end_time": "2023-03-15T10:45:06.248897", "exception": false, "start_time": "2023-03-15T10:45:06.238778", "status": "completed"}, "tags": []}, "source": ["## Setup\n", "This notebook requires some packages besides pytorch-lightning."]}, {"cell_type": "code", "execution_count": 1, "id": "a9524551", "metadata": {"colab": {}, "colab_type": "code", "execution": {"iopub.execute_input": "2023-03-15T10:45:06.272368Z", "iopub.status.busy": "2023-03-15T10:45:06.271551Z", "iopub.status.idle": "2023-03-15T10:45:10.556894Z", "shell.execute_reply": "2023-03-15T10:45:10.555484Z"}, "id": "LfrJLKPFyhsK", "lines_to_next_cell": 0, "papermill": {"duration": 4.300809, "end_time": "2023-03-15T10:45:10.559773", "exception": false, "start_time": "2023-03-15T10:45:06.258964", "status": "completed"}, "tags": []}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["\u001b[33mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv\u001b[0m\u001b[33m\r\n", "\u001b[0m"]}], "source": ["! 
pip install --quiet \"setuptools==67.4.0\" \"lightning>=2.0.0rc0\" \"ipython[notebook]>=8.0.0, <8.12.0\" \"datasets<2.8.0\" \"torch>=1.8.1, <1.14.0\" \"pytorch-lightning>=1.4, <2.0.0\" \"finetuning-scheduler[examples]>=0.4.0\" \"torchmetrics>=0.7, <0.12\""]}, {"cell_type": "markdown", "id": "464d6be9", "metadata": {"papermill": {"duration": 0.044229, "end_time": "2023-03-15T10:45:10.619512", "exception": false, "start_time": "2023-03-15T10:45:10.575283", "status": "completed"}, "tags": []}, "source": ["## Scheduled Fine-Tuning with the Fine-Tuning Scheduler Extension\n", "\n", "The [Fine-Tuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension accelerates and enhances model experimentation with flexible fine-tuning schedules.\n", "\n", "Training with the extension is simple and confers a host of benefits:\n", "\n", "- it dramatically increases fine-tuning flexibility\n", "- it expedites and facilitates exploration of model tuning dynamics\n", "- it enables marginal performance improvements of fine-tuned models\n", "\n", "Setup is straightforward; just install from PyPI! Since this notebook-based example requires a few additional packages (e.g.\n", "``transformers``, ``sentencepiece``), we installed the ``finetuning-scheduler`` package with the ``[examples]`` extra above.\n", "Once the ``finetuning-scheduler`` package is installed, the [FinetuningScheduler](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts.html#finetuning_scheduler.fts.FinetuningScheduler) callback is available for use with PyTorch Lightning, as sketched below.\n", "For additional installation options, please see the Fine-Tuning Scheduler [README](https://github.com/speediedan/finetuning-scheduler/blob/main/README.md).\n", "\n", "\n", "\n", "
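\n", "Attaching the callback is a one-liner. The following minimal sketch illustrates this basic usage; ``model`` and ``dm`` are hypothetical placeholders for a ``LightningModule`` and a ``LightningDataModule`` (they are not defined at this point in the notebook), and with no user-provided schedule the callback generates and applies a default fine-tuning schedule:\n", "\n", "```python\n", "from pytorch_lightning import Trainer\n", "\n", "from finetuning_scheduler import FinetuningScheduler\n", "\n", "# No explicit schedule is passed, so FinetuningScheduler generates a default\n", "# fine-tuning schedule and uses it to drive the scheduled fine-tuning phases\n", "trainer = Trainer(callbacks=[FinetuningScheduler()])\n", "\n", "# model and dm are placeholders: supply your own LightningModule / LightningDataModule\n", "# trainer.fit(model, datamodule=dm)\n", "```\n", "\n", "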