{"cells": [{"cell_type": "markdown", "id": "8df0a60d", "metadata": {"papermill": {"duration": 0.017977, "end_time": "2023-01-05T11:03:35.603891", "exception": false, "start_time": "2023-01-05T11:03:35.585914", "status": "completed"}, "tags": []}, "source": ["\n", "# Fine-Tuning Scheduler\n", "\n", "* **Author:** [Dan Dale](https://github.com/speediedan)\n", "* **License:** CC BY-SA\n", "* **Generated:** 2023-01-05T12:03:14.890703\n", "\n", "This notebook introduces the [Fine-Tuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension\n", "and demonstrates the use of it to fine-tune a small foundational model on the\n", "[RTE](https://huggingface.co/datasets/viewer/?dataset=super_glue&config=rte) task of\n", "[SuperGLUE](https://super.gluebenchmark.com/) with iterative early-stopping defined according to a user-specified\n", "schedule. It uses Hugging Face's ``datasets`` and ``transformers`` libraries to retrieve the relevant benchmark data\n", "and foundational model weights. The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.\n", "\n", "\n", "---\n", "Open in [{height=\"20px\" width=\"117px\"}](https://colab.research.google.com/github/PytorchLightning/lightning-tutorials/blob/publication/.notebooks/lightning_examples/finetuning-scheduler.ipynb)\n", "\n", "Give us a \u2b50 [on Github](https://www.github.com/Lightning-AI/lightning/)\n", "| Check out [the documentation](https://pytorch-lightning.readthedocs.io/en/stable/)\n", "| Join us [on Slack](https://www.pytorchlightning.ai/community)"]}, {"cell_type": "markdown", "id": "f0efb864", "metadata": {"papermill": {"duration": 0.010473, "end_time": "2023-01-05T11:03:35.625389", "exception": false, "start_time": "2023-01-05T11:03:35.614916", "status": "completed"}, "tags": []}, "source": ["## Setup\n", "This notebook requires some packages besides pytorch-lightning."]}, {"cell_type": "code", "execution_count": 1, "id": "0ce32894", "metadata": {"colab": {}, "colab_type": "code", "execution": {"iopub.execute_input": "2023-01-05T11:03:35.647468Z", "iopub.status.busy": "2023-01-05T11:03:35.647212Z", "iopub.status.idle": "2023-01-05T11:03:39.400418Z", "shell.execute_reply": "2023-01-05T11:03:39.399069Z"}, "id": "LfrJLKPFyhsK", "lines_to_next_cell": 0, "papermill": {"duration": 3.767733, "end_time": "2023-01-05T11:03:39.403324", "exception": false, "start_time": "2023-01-05T11:03:35.635591", "status": "completed"}, "tags": []}, "outputs": [{"name": "stdout", "output_type": "stream", "text": ["\u001b[33mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv\u001b[0m\u001b[33m\r\n", "\u001b[0m"]}], "source": ["! 
pip install --quiet \"pytorch-lightning>=1.4, <1.9\" \"finetuning-scheduler[examples]>=0.3.0\" \"ipython[notebook]>=8.0.0, <8.9.0\" \"datasets<2.8.0\" \"torch>=1.8.1, <1.14.0\" \"setuptools==65.6.3\" \"torchmetrics>=0.7, <0.12\""]}, {"cell_type": "markdown", "id": "87d5672a", "metadata": {"papermill": {"duration": 0.010232, "end_time": "2023-01-05T11:03:39.429264", "exception": false, "start_time": "2023-01-05T11:03:39.419032", "status": "completed"}, "tags": []}, "source": ["## Scheduled Fine-Tuning with the Fine-Tuning Scheduler Extension\n", "\n", "The [Fine-Tuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension accelerates and enhances model experimentation with flexible fine-tuning schedules.\n", "\n", "Training with the extension is simple and confers several benefits:\n", "\n", "- dramatically increases fine-tuning flexibility\n", "- expedites and facilitates exploration of model tuning dynamics\n", "- enables marginal performance improvements of fine-tuned models\n", "\n", "Setup is straightforward: just install from PyPI! Since this notebook-based example requires a few additional packages (e.g.\n", "``transformers``, ``sentencepiece``), we installed the ``finetuning-scheduler`` package with the ``[examples]`` extra above.\n", "Once the ``finetuning-scheduler`` package is installed, the [FinetuningScheduler](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts.html#finetuning_scheduler.fts.FinetuningScheduler) callback is available for use with PyTorch Lightning.\n", "For additional installation options, please see the Fine-Tuning Scheduler [README](https://github.com/speediedan/finetuning-scheduler/blob/main/README.md).\n", "\n",
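"With the package installed, enabling scheduled fine-tuning is as simple as adding the callback to your ``Trainer``. The\n", "minimal sketch below follows the usage pattern shown in the project README; when no explicit ``ft_schedule`` is provided,\n", "the callback generates and applies a default fine-tuning schedule for the attached model:\n", "\n", "```python\n", "from pytorch_lightning import Trainer\n", "\n", "from finetuning_scheduler import FinetuningScheduler\n", "\n", "# with no ``ft_schedule`` argument, a default fine-tuning schedule is generated\n", "# and applied to the model subsequently passed to ``Trainer.fit``\n", "trainer = Trainer(callbacks=[FinetuningScheduler()])\n", "```\n", "\n", "To use an explicit, user-defined schedule instead, pass its location via the ``ft_schedule`` parameter (e.g.\n", "``FinetuningScheduler(ft_schedule=\"my_schedule.yaml\")``, where ``my_schedule.yaml`` is a hypothetical schedule file).\n"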