{"id":5649098,"date":"2023-10-17T18:37:27","date_gmt":"2023-10-17T22:37:27","guid":{"rendered":"https:\/\/lightning.ai\/pages\/?p=5649098"},"modified":"2023-10-17T18:42:15","modified_gmt":"2023-10-17T22:42:15","slug":"pl-tutorial-and-overview","status":"publish","type":"post","link":"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/","title":{"rendered":"PyTorch Lightning for Dummies &#8211; A Tutorial and Overview"},"content":{"rendered":"<div class=\"takeaways card-glow p-4 my-4\"><h3 class=\"w-100 d-block\">Takeaways<\/h3>\n<p>You\u2019ll learn to use PyTorch Lightning\u2019s Core API by completing an applied project: training a language Transformer, written in PyTorch, on the WikiText2 dataset.<\/p>\n<\/div>\n<p>The code in this tutorial is available on GitHub in the <a href=\"https:\/\/github.com\/JustinGoheen\/text-lab\" target=\"_blank\" rel=\"noopener\">text-lab<\/a> repo. Clone the repo and follow along!<\/p>\n<h2>Introduction<\/h2>\n<p>Training deep learning models at scale is an interesting and complex task. Reproducibility is key, and reproducible code bases are exactly what we get when we use PyTorch Lightning for training and finetuning. An added benefit is that PyTorch Lightning is domain agnostic and complementary to PyTorch. 
Meaning \u2013 it does not replace PyTorch and we are enabled to train text, vision, audio, and multimodal models using the same framework \u2013 PyTorch Lightning.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-5649101\" src=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/PyTorch-Lightning-and-Fabric-1-1024x576.png\" alt=\"\" width=\"612\" height=\"344\" srcset=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/PyTorch-Lightning-and-Fabric-1-1024x576.png 1024w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/PyTorch-Lightning-and-Fabric-1-300x169.png 300w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/PyTorch-Lightning-and-Fabric-1-1536x864.png 1536w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/PyTorch-Lightning-and-Fabric-1.png 1600w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/PyTorch-Lightning-and-Fabric-1-300x169@2x.png 600w\" sizes=\"(max-width: 612px) 100vw, 612px\" \/><\/p>\n<h3>The Research<\/h3>\n<p>Our research objective for this tutorial is to train a small language model using a Transformer on the WikiText2 dataset. Both the Transformer and the dataset are available to us in PyTorch Lightning at <code>pytorch_lightning.demos.transformer<\/code>. We\u2019ll see later how we can pull those into our Python module or Jupyter Notebook for use in our custom <code>LightningDataModule<\/code> and <code>LightningModule<\/code>.<\/p>\n<h2>PyTorch and PyTorch Lightning<\/h2>\n<p>PyTorch Lightning is not a replacement for PyTorch. Rather, PyTorch Lightning is an extension &#8211; a framework used to train models that have been implemented with PyTorch. 
This relationship is visualized in the following snippet.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">import pytorch_lightning as pl\r\nfrom pytorch_lightning.demos.transformer import Transformer\r\n\r\nclass LabModule(pl.LightningModule):\r\n    def __init__(self, vocab_size: int = 33278):\r\n        super().__init__()\r\n        self.model = Transformer(vocab_size=vocab_size)<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>When we create <code>self.model<\/code> as shown above, we often refer to <code>self.model<\/code> as the internal module. Let\u2019s keep reading to learn how to apply this interoperability between PyTorch and PyTorch Lightning!<\/p>\n<h2>PyTorch Lightning: The Core API<\/h2>\n<p>Okay \u2013 time to get to it! In the next sections, we will cover how to use the Core API of PyTorch Lightning. What is the Core API? First, let&#8217;s consider how we might organize the steps of any deep learning project: processing the data, creating a model, and then training that model on the given dataset.<\/p>\n<p>These key steps are exactly how the Core API is structured, with <code>LightningDataModule<\/code>, <code>LightningModule<\/code>, and <code>Trainer<\/code>.<\/p>\n<h3>LightningDataModule<\/h3>\n<p><code>LightningDataModule<\/code> (LDM) wraps the data phase. It takes in a custom PyTorch <code>Dataset<\/code> and <code>DataLoader<\/code>, which enables <code>Trainer<\/code> to handle data during training. LDM also exposes the <code>setup<\/code> and <code>prepare_data<\/code> hooks in case you need additional customization. For training, the PyTorch DataLoaders must be returned from the <code>train_dataloader<\/code> and <code>val_dataloader<\/code> methods. 
The following pseudo-code shows how to import <code>LightningDataModule<\/code> and use it to create a custom class.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">import pytorch_lightning as pl\r\n\r\nclass LabDataModule(pl.LightningDataModule):\r\n    def __init__(self):\r\n        super().__init__()<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>We will see examples of creating <code>train_dataloader<\/code> and <code>val_dataloader<\/code> methods in LDM later in this tutorial.<\/p>\n<h3>LightningModule<\/h3>\n<p><code>LightningModule<\/code> is the main training interface; it wraps the PyTorch models we referred to earlier as \u2018internal modules\u2019. <code>LightningModule<\/code> itself is a custom <code>torch.nn.Module<\/code> that is extended with dozens of additional hooks like <code>on_fit_start<\/code> and <code>on_fit_end<\/code>. Overriding these hooks gives us finer control over <code>Trainer<\/code>\u2019s flow and enables custom behaviors. The following pseudo-code shows how to import and use <code>LightningModule<\/code> to create a custom class.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">import pytorch_lightning as pl\r\n\r\nclass LabModule(pl.LightningModule):\r\n    def __init__(self):\r\n        super().__init__()<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<h3>Trainer<\/h3>\n<p><code>Trainer<\/code> configures the training scope and manages the training loop with <code>LightningModule<\/code> and <code>LightningDataModule<\/code>. 
<code>Trainer<\/code> is configured by setting flags like <code>devices<\/code>, <code>accelerator<\/code>, and <code>strategy<\/code>, and by passing in our choice of <code>loggers<\/code>, <code>profilers<\/code>, <code>callbacks<\/code>, and <code>plugins<\/code>; the simplest configuration just uses the defaults.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">import pytorch_lightning as pl\r\n\r\n# instantiate the trainer\r\ntrainer = pl.Trainer() \r\n\r\n# instantiate the datamodule\r\ndatamodule = LabDataModule() \r\n# instantiate the model\r\nmodel = LabModule() \r\n\r\n# call fit to start training\r\ntrainer.fit(model=model, datamodule=datamodule) <\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>Rather see this explained in a video? Sebastian Raschka, our Lead AI Educator, breaks down how to get started structuring our PyTorch code using PyTorch Lightning.<\/p>\n<p><iframe loading=\"lazy\" title=\"YouTube video player\" src=\"https:\/\/www.youtube.com\/embed\/D1aYLlbfC14?si=zal-Pm2rB3o__O24\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<h2>Getting Started: Hands-on Coding<\/h2>\n<h3>Installing PyTorch Lightning<\/h3>\n<p>First, we need to install PyTorch Lightning. The installation process shows how closely integrated PyTorch Lightning is with PyTorch: the following terminal command installs PyTorch Lightning and also installs PyTorch into our virtual environment. 
So let\u2019s go ahead and install PyTorch Lightning using the following command.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">pip install pytorch-lightning<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>We also need to install TorchText in order to run the demo. Let\u2019s also do that by using the following command in the terminal.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">pip install torchtext<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>Do you need help creating a virtual environment? There\u2019s a video for that too!<\/p>\n<p><iframe loading=\"lazy\" title=\"YouTube video player\" src=\"https:\/\/www.youtube.com\/embed\/WHWsABk4Ejk?si=zQ-ilrCj04AE1RkZ\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<h3>The Custom LightningDataModule<\/h3>\n<p>The dataset we will use is WikiText2. The demo code available to us in PyTorch Lightning will automatically fetch WikiText2 for us \u2013 so there\u2019s no need to worry about downloading the dataset from torchtext.<\/p>\n<p>In the example below, WikiText2 is imported as <code>LabDataset<\/code>. If you wish to do so, you can check out the code used to create the custom PyTorch Dataset in <code>textlab.pipeline.dataset.py<\/code>. However, for the purposes of this tutorial, we can ignore that implementation for now. 
Once again, here\u2019s the pseudo-code for creating a <code>LightningDataModule<\/code> without any additional customization.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">import pytorch_lightning as pl\r\n\r\nclass LabDataModule(pl.LightningDataModule):\r\n    def __init__(self):\r\n        super().__init__()<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<h4>Creating the Custom Class<\/h4>\n<p>Compared to the pseudo-code example, we need to customize the <code>__init__<\/code> method further to enable random splitting of the dataset and to let the <code>LightningDataModule<\/code> know the data source. This is also where we can provide domain-specific arguments like <code>block_size<\/code> for text datasets, or <code>image_size<\/code> for vision datasets.<\/p>\n<p>In the code blocks below, we create a custom class for this project that subclasses <code>LightningDataModule<\/code>. 
This custom class will be called <code>LabDataModule<\/code>.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">from pathlib import Path\r\n\r\nfrom torch.utils.data import DataLoader, random_split\r\n\r\nimport pytorch_lightning as pl\r\n\r\nfrom textlab.pipeline import LabDataset\r\n\r\nclass LabDataModule(pl.LightningDataModule):\r\n    def __init__(\r\n        self,\r\n        num_workers: int = 2,\r\n        data_dir: Path = Path(\"data\"),\r\n        block_size: int = 35,\r\n        download: bool = True,\r\n        train_size: float = 0.8,\r\n    ):\r\n        super().__init__()\r\n        self.data_dir = data_dir\r\n        self.block_size = block_size\r\n        self.download = download\r\n        self.num_workers = num_workers\r\n        self.train_size = train_size\r\n        self.dataset = None<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<h4>Preparing the Data<\/h4>\n<p>The <code>prepare_data<\/code> method will be called first by <code>Trainer<\/code>. When it runs, the dataset will either be downloaded or fetched automatically from the data directory cache. 
In particular, this matters for multi-node training, where each node needs its own copy of the training dataset.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">    def prepare_data(self):\r\n        self.dataset = LabDataset(\r\n            data_dir=self.data_dir,\r\n            block_size=self.block_size,\r\n            download=self.download,\r\n        )\r\n<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<h4>Setting Up the Data Splits<\/h4>\n<p>The <code>setup<\/code> method is called by <code>Trainer<\/code> while setting up the training process on each device\/GPU. In this hook, we create the <code>train_data<\/code>, <code>val_data<\/code>, and <code>test_data<\/code> splits of the dataset and store them as attributes of <code>LabDataModule<\/code> (these splits will later be passed to the DataLoaders).<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">    def setup(self, stage: str):\r\n        if stage == \"fit\" or stage is None:\r\n            train_size = int(len(self.dataset) * self.train_size)\r\n            val_size = len(self.dataset) - train_size\r\n            self.train_data, self.val_data = random_split(self.dataset, lengths=[train_size, val_size])\r\n        if stage == \"test\" or stage is None:\r\n            self.test_data = self.val_data<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<h4>The Train, Validation, and Test DataLoaders<\/h4>\n<p>As implied above, <code>Trainer<\/code> does not access <code>train_data<\/code>, <code>val_data<\/code>, or <code>test_data<\/code> directly; instead, it is fed batches of data through PyTorch <code>DataLoaders<\/code>.<\/p>\n<pre class=\"snippet-shortcode code-shortcode 
dark-theme collapse-false\"><code class=\"hljs language-python\">    def train_dataloader(self):\r\n        return DataLoader(self.train_data, num_workers=self.num_workers)\r\n\r\n    def val_dataloader(self):\r\n        return DataLoader(self.val_data, num_workers=self.num_workers)\r\n\r\n    def test_dataloader(self):\r\n        return DataLoader(self.test_data, num_workers=self.num_workers)<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>Now that we have handled our data phase by creating <code>LabDataModule<\/code>, let\u2019s move on to our model phase by creating a custom <code>LightningModule<\/code> around our Transformer internal module.<\/p>\n<p>Need a video break? Here\u2019s an awesome summary by Sebastian on Organizing Your Data Loaders with Data Modules.<\/p>\n<p><iframe loading=\"lazy\" title=\"YouTube video player\" src=\"https:\/\/www.youtube.com\/embed\/ejXYUte4q3U?si=uyFOPFuWWiyHDX7J\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<h3>The Custom LightningModule<\/h3>\n<h4>Creating the Language Model with PyTorch<\/h4>\n<p>The demo Transformer from PyTorch Lightning is implemented with PyTorch. The details of the implementation are outside the scope of this post. For now, let&#8217;s just remember the previously discussed concept of <code>torch.nn.Modules<\/code> as internal modules that will be used in <code>LightningModules<\/code> as <code>self.model<\/code>.<\/p>\n<p>Curious about what the Transformer implementation looks like? 
You can <a href=\"https:\/\/github.com\/Lightning-AI\/lightning\/blob\/master\/src\/lightning\/pytorch\/demos\/transformer.py\" target=\"_blank\" rel=\"noopener\">check it out on GitHub<\/a>!<\/p>\n<h4>Creating the Custom Class<\/h4>\n<p>Just as we further customized <code>LabDataModule<\/code>, we also need to customize our <code>LightningModule<\/code> in the class shown below. This new class, <code>LabModule<\/code>, is our custom <code>LightningModule<\/code> that will interface with <code>Trainer<\/code> and train the internal module with the hooks that we cover in the next few sections.<\/p>\n<p>Below we can see that we have added <code>forward<\/code> and <code>training_step<\/code> in addition to <code>__init__<\/code>. The forward method calls the internal module; during training, PyTorch\u2019s autograd records these operations so that gradients can be computed. Autograd is beyond the scope of this post. However, if you\u2019d like to learn more, you can read this <a href=\"https:\/\/pytorch.org\/tutorials\/beginner\/blitz\/autograd_tutorial.html\" target=\"_blank\" rel=\"noopener\">Gentle Introduction to torch.autograd<\/a> by the PyTorch team.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">import torch\r\nimport pytorch_lightning as pl\r\nfrom pytorch_lightning.demos.transformer import Transformer\r\n\r\nclass LabModule(pl.LightningModule):\r\n    def __init__(self, vocab_size: int = 33278):\r\n        super().__init__()\r\n        self.model = Transformer(vocab_size=vocab_size)\r\n\r\n    def forward(self, inputs, target):\r\n        return self.model(inputs, target)\r\n\r\n    def training_step(self, batch, batch_idx):\r\n        inputs, target = batch\r\n        output = self(inputs, target)\r\n        loss = torch.nn.functional.nll_loss(output, target.view(-1))\r\n        return loss<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<h4>The training_step Method<\/h4>\n<p>The primary 
interaction between a <code>LightningModule<\/code> and <code>Trainer<\/code> happens via the <code>training_step<\/code> method of the <code>LightningModule<\/code>. This method calls the internal module and computes the loss needed for model optimization. In particular, the method receives a batch of inputs, passes the batch to the model, and collects an output. Then, it calculates and returns the loss, which <code>Trainer<\/code> uses to compute gradients and update the model\u2019s weights.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">    def training_step(self, batch, batch_idx):\r\n        inputs, target = batch\r\n        output = self(inputs, target)\r\n        loss = torch.nn.functional.nll_loss(output, target.view(-1))\r\n        self.log(\"training-loss\", loss)\r\n        return loss<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<h4>The validation_step and test_step Methods<\/h4>\n<p>These validation and test methods are similar to <code>training_step<\/code>, except that no loss is returned. 
If we use the <code>EarlyStopping<\/code> callback, it can monitor the loss logged in <code>validation_step<\/code> and stop training when no improvement is observed.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">    def validation_step(self, batch, batch_idx):\r\n        inputs, target = batch\r\n        output = self(inputs, target)\r\n        loss = torch.nn.functional.nll_loss(output, target.view(-1))\r\n        self.log(\"val-loss\", loss)\r\n\r\n    def test_step(self, batch, batch_idx):\r\n        inputs, target = batch\r\n        output = self(inputs, target)\r\n        loss = torch.nn.functional.nll_loss(output, target.view(-1))\r\n        self.log(\"test-loss\", loss)<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>Curious about <code>EarlyStopping<\/code>? Check out this video by our founder, Will Falcon, to learn more!<\/p>\n<p><iframe loading=\"lazy\" title=\"YouTube video player\" src=\"https:\/\/www.youtube.com\/embed\/vfB5Ax6ekHo?si=O4W4zJQ1-GIn7Zxc\" width=\"560\" height=\"315\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<h4>The configure_optimizers Method<\/h4>\n<p>PyTorch Lightning provides two main modes for managing the optimization process: manual and automatic. 
For the majority of research cases, automatic optimization will do the right thing, and it is what most users should use; we will rely on it here, defining our optimizer in <code>configure_optimizers<\/code> as shown below.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">    def configure_optimizers(self):\r\n        return torch.optim.SGD(self.model.parameters(), lr=0.1)<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<h3>Using Trainer<\/h3>\n<p>Okay \u2013 let\u2019s talk a little more about <code>Trainer<\/code>, the star of the show. Years of development have gone into PyTorch Lightning, and especially into <code>Trainer<\/code> and all that lies under the hood \u2013 like its ability to configure the environment, as mentioned in the introduction.<\/p>\n<p>We\u2019ll take the <code>Trainer<\/code> snippet from the introduction and extend it to show how to configure the environment and then conduct a training run.<\/p>\n<h4>Configuring Trainer with Flags<\/h4>\n<p>We can configure our environment when we first instantiate the <code>Trainer<\/code> object. 
This is done with flags like <code>devices<\/code>, <code>accelerator<\/code>, and <code>strategy<\/code> and by passing in our choice of <code>loggers<\/code>, <code>profilers<\/code>, and <code>callbacks<\/code>.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">import pytorch_lightning as pl\r\nfrom pytorch_lightning.callbacks import EarlyStopping\r\nfrom pytorch_lightning.loggers.wandb import WandbLogger\r\nfrom pytorch_lightning.profilers import PyTorchProfiler\r\n\r\ntrainer = pl.Trainer(\r\n    devices=\"auto\",\r\n    accelerator=\"auto\",\r\n    strategy=\"auto\",\r\n    precision=\"32-true\",\r\n    enable_checkpointing=True,\r\n    callbacks=EarlyStopping(monitor=\"val-loss\", mode=\"min\"),\r\n    logger=WandbLogger(name=\"textlab-demo\", save_dir=\"logs\/wandb\"),\r\n    profiler=PyTorchProfiler(dirpath=\"logs\/torch_profiler\"),\r\n)<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>Aside from these additional features, you will also notice that we have set <code>devices<\/code>, <code>accelerator<\/code>, <code>strategy<\/code>, <code>precision<\/code>, and <code>enable_checkpointing<\/code>. 
Additional context for each is provided below:<\/p>\n<ul>\n<li><code>accelerator<\/code>: supports passing different accelerator types (&#8220;cpu&#8221;, &#8220;gpu&#8221;, &#8220;tpu&#8221;, &#8220;ipu&#8221;, &#8220;auto&#8221;) as well as custom accelerator instances.<\/li>\n<li><code>devices<\/code>: the number of devices to train on (int), which devices to train on (list or str), or &#8220;auto&#8221;.<\/li>\n<li><code>strategy<\/code>: supports passing different training strategies with aliases (\u201cddp\u201d, \u201cfsdp\u201d, etc.) as well as configured strategies.<\/li>\n<li><code>precision<\/code>: check out <a href=\"https:\/\/youtu.be\/RJO05tlGQAI?feature=shared\" target=\"_blank\" rel=\"noopener\">this video<\/a> by Will on precision.<\/li>\n<li><code>enable_checkpointing<\/code>: saves a checkpoint in your current working directory, with the state of your last training epoch.<\/li>\n<\/ul>\n<p>In the snippet shown above, we also enable our training session with additional features like <code>EarlyStopping<\/code>, <code>WandbLogger<\/code>, and <code>PyTorchProfiler<\/code>.<\/p>\n<p>Why is using <code>PyTorchProfiler<\/code> handy? Because it allows us to find bottlenecks in our training loop. If you\u2019re interested in learning more, here\u2019s our <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/tuning\/profiler_intermediate.html\" target=\"_blank\" rel=\"noopener\">official documentation<\/a> on the topic.<\/p>\n<p>Why use <code>WandbLogger<\/code>? Because it allows us to visualize and compare our experiments with interactive graphs and tables on the Weights and Biases platform.<\/p>\n<h4>Training the Model<\/h4>\n<p>Now that we have passed in our appropriate flags, callbacks, and support plugins, we are ready to train the model! 
We can start the training with the three easy lines shown below.<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\"># instantiate the datamodule\r\ndatamodule = LabDataModule() \r\n# instantiate the model\r\nmodel = LabModule() \r\n# call .fit\r\ntrainer.fit(model=model, datamodule=datamodule) <\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>If you\u2019ve cloned and installed the demo repo, <a href=\"https:\/\/github.com\/JustinGoheen\/text-lab\" target=\"_blank\" rel=\"noopener\">text-lab,<\/a> then you can test out what we\u2019ve done above with one of the following commands in the terminal.<\/p>\n<p>To run Trainer in fast-dev-run mode, use this command:<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">lab run dev-run<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>Otherwise, you can test a full demo run and log it with CSVLogger with:<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">lab run demo-run<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>And if you have a Weights and Biases account, you can log your run with:<\/p>\n<pre class=\"snippet-shortcode code-shortcode dark-theme collapse-false\"><code class=\"hljs language-python\">lab run demo-run --logger wandb<\/code><div class=\"copy-button\"><button class=\"expand-button active\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<h2>Great Work!<\/h2>\n<p>Above we learned how to organize our PyTorch code into a <code>LightningDataModule<\/code>, <code>LightningModule<\/code>, and to automate everything 
else with <code>Trainer<\/code>. Along the way, we trained a language Transformer on the WikiText2 dataset and saw how to create custom classes by subclassing the Lightning interfaces. We also used a custom CLI built with Typer.<\/p>\n<h2>Conclusion<\/h2>\n<p>Research and production code can quickly grow from simple modules to feature-rich packages as technical needs like distributed training and quantization arise. PyTorch Lightning has implemented these features for you, abstracting them away from the code you care about most \u2013 the research code focused on training your model.<\/p>\n<p>By using PyTorch Lightning you inherit well-tested features, which translates to faster prototyping thanks to fewer bugs and faster iterations. You also inherit code written by researchers, for researchers: PyTorch Lightning\u2019s contributors are Ph.D. candidates, research scientists, and research engineers at leading AI labs.<\/p>\n<p>PyTorch Lightning structures our code into four cohesive segments: data code, engineering code, research code, and support code (loggers, profilers). Compartmentalizing by task helps us organize our code base, increasing readability and reusability. In turn, this creates a more maintainable code base suitable for beginners and experts alike.<\/p>\n<p>Remember \u2013 PyTorch Lightning is not a replacement framework for PyTorch. 
Instead, it is a philosophy and methodology of organizing your PyTorch code to create reproducible state of the art research at scale \u2013 with ease.<\/p>\n<h2>Resources<\/h2>\n<p><strong>Official Documentation<\/strong><\/p>\n<ul>\n<li><a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/data\/datamodule.html#lightningdatamodule\" target=\"_blank\" rel=\"noopener\">LightningDataModule<\/a><\/li>\n<li><a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/common\/lightning_module.html\" target=\"_blank\" rel=\"noopener\">LightningModule<\/a><\/li>\n<li><a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/common\/trainer.html\" target=\"_blank\" rel=\"noopener\">Trainer<\/a><\/li>\n<\/ul>\n<p><strong>Code<\/strong><\/p>\n<ul>\n<li><a href=\"https:\/\/github.com\/JustinGoheen\/text-lab\" target=\"_blank\" rel=\"noopener\">Text Lab<\/a><\/li>\n<\/ul>\n<h2>Still have questions?<\/h2>\n<p>We have an amazing community and team of core engineers ready to answer questions you might have about PyTorch Lightning and the rest of the Lightning ecosystem. So, join us on <a href=\"https:\/\/lightning.ai\/forums\/\" target=\"_blank\" rel=\"noopener\">Discourse<\/a> or <a href=\"https:\/\/discord.gg\/XncpTy7DSt\" target=\"_blank\" rel=\"noopener\">Discord<\/a>. See you there!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The code in this tutorial is available on GitHub in the text-lab repo. Clone the repo and follow along! Introduction Training deep learning models at scale is an incredibly interesting and complex task. 
Reproducibility for projects is key, and reproducible code bases are exactly what we get when we leverage PyTorch Lightning for training and<a class=\"excerpt-read-more\" href=\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/\" title=\"ReadPyTorch Lightning for Dummies &#8211; A Tutorial and Overview\">&#8230; Read more &raquo;<\/a><\/p>\n","protected":false},"author":39,"featured_media":5649102,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"inline_featured_image":false,"footnotes":"","_links_to":"","_links_to_target":""},"categories":[27,29],"tags":[],"glossary":[],"acf":{"mathjax":false,"default_editor":true,"show_table_of_contents":false,"additional_authors":false,"hide_from_archive":false,"content_type":"Blog Post","sticky":false,"code_embed":true,"tabs":false,"custom_styles":"","code_shortcode":[{"shortcode_title":"snippet_1","code":"import pytorch_lightning as pl\r\n\r\nclass LabModule(pl.LightningModule):\r\n    def __init__(self, vocab_size: int = 33278):\r\n        super().__init__()\r\n        self.model = Transformer(vocab_size=vocab_size)","syntax":"python","collapse":true},{"shortcode_title":"snippet_2","code":"import pytorch_lightning as pl\r\n\r\nclass LabDataModule(pl.LightningDataModule):\r\n    def __init__(self):\r\n        super().__init__()","syntax":"python","collapse":true},{"shortcode_title":"snippet_3","code":"import pytorch_lightning as pl\r\n\r\nclass LabModule(pl.LightningModule):\r\n    def __init__(self):\r\n        super().__init__()","syntax":"python","collapse":true},{"shortcode_title":"snippet_4","code":"import pytorch_lightning as pl\r\n\r\n# instantiate the trainer\r\ntrainer = pl.Trainer() \r\n\r\n# instantiate the datamodule\r\ndatamodule = LabDataModule() \r\n# instantiate the model\r\nmodel = LabModule() \r\n\r\n# call fit to start training\r\ntrainer.fit(model=model, datamodule=datamodule) 
","syntax":"python","collapse":true},{"shortcode_title":"snippet_5","code":"pip install pytorch-lightning","syntax":"python","collapse":true},{"shortcode_title":"snippet_6","code":"pip install torchtext","syntax":"python","collapse":true},{"shortcode_title":"snippet_7","code":"import pytorch_lightning as pl\r\n\r\nclass LabDataModule(pl.LightningDataModule):\r\n    def __init__(self):\r\n        super().__init__()","syntax":"python","collapse":true},{"shortcode_title":"snippet_8","code":"from pathlib import Path\r\n\r\nfrom torch.utils.data import DataLoader, random_split\r\n\r\nimport pytorch_lightning as pl\r\n\r\nfrom textlab import Config\r\nfrom textlab.pipeline import LabDataset\r\n\r\nclass LabDataModule(pl.LightningDataModule):\r\n    def __init__(\r\n        self,\r\n        num_workers: int = 2,\r\n        data_dir: Path = Path(\"data\"),\r\n        block_size: int = 35,\r\n        download: bool = True,\r\n        train_size: float = 0.8,\r\n    ):\r\n        super().__init__()\r\n        self.data_dir = data_dir\r\n        self.block_size = block_size\r\n        self.download = download\r\n        self.num_workers = num_workers\r\n        self.train_size = train_size\r\n        self.dataset = None","syntax":"python","collapse":true},{"shortcode_title":"snippet_9","code":"    def prepare_data(self):\r\n        self.dataset = LabDataset(\r\n            data_dir=self.data_dir,\r\n            block_size=self.block_size,\r\n            download=self.download\r\n        )\r\n","syntax":"python","collapse":true},{"shortcode_title":"snippet_10","code":"    def setup(self, stage: str):\r\n        if stage == \"fit\" or stage is None:\r\n            train_size = int(len(self.dataset) * self.train_size)\r\n            test_size = len(self.dataset) - train_size\r\n            self.train_data, self.val_data = random_split(self.dataset, lengths=[train_size, test_size])\r\n        if stage == \"test\" or stage is None:\r\n            self.test_data = 
self.val_data","syntax":"python","collapse":true},{"shortcode_title":"snippet_11","code":"    def train_dataloader(self):\r\n        return DataLoader(self.train_data, num_workers=self.num_workers)\r\n\r\n    def val_dataloader(self):\r\n        return DataLoader(self.val_data, num_workers=self.num_workers)\r\n\r\n    def test_dataloader(self):\r\n        return DataLoader(self.test_data, num_workers=self.num_workers)","syntax":"python","collapse":true},{"shortcode_title":"snippet_12","code":"import torch\r\n\r\nimport pytorch_lightning as pl\r\nfrom pytorch_lightning.demos.transformer import Transformer\r\n\r\nclass LabModule(pl.LightningModule):\r\n    def __init__(self, vocab_size: int = 33278):\r\n        super().__init__()\r\n        self.model = Transformer(vocab_size=vocab_size)\r\n\r\n    def forward(self, inputs, target):\r\n        return self.model(inputs, target)\r\n\r\n    def training_step(self, batch, batch_idx):\r\n        inputs, target = batch\r\n        output = self(inputs, target)\r\n        loss = torch.nn.functional.nll_loss(output, target.view(-1))\r\n        return loss","syntax":"python","collapse":true},{"shortcode_title":"snippet_13","code":"    def training_step(self, batch, batch_idx):\r\n        inputs, target = batch\r\n        output = self(inputs, target)\r\n        loss = torch.nn.functional.nll_loss(output, target.view(-1))\r\n        self.log(\"training-loss\", loss)\r\n        return loss","syntax":"python","collapse":true},{"shortcode_title":"snippet_14","code":"    def validation_step(self, batch, batch_idx):\r\n        inputs, target = batch\r\n        output = self(inputs, target)\r\n        loss = torch.nn.functional.nll_loss(output, target.view(-1))\r\n        self.log(\"val-loss\", loss)\r\n\r\n    def test_step(self, batch, batch_idx):\r\n        inputs, target = batch\r\n        output = self(inputs, target)\r\n        loss = torch.nn.functional.nll_loss(output, target.view(-1))\r\n        self.log(\"test-loss\", loss)","syntax":"python","collapse":true},{"shortcode_title":"snippet_15","code":"    def configure_optimizers(self):\r\n        return torch.optim.SGD(self.model.parameters(), lr=0.1)","syntax":"python","collapse":true},{"shortcode_title":"snippet_16","code":"import pytorch_lightning as pl\r\nfrom pytorch_lightning.callbacks import EarlyStopping\r\nfrom pytorch_lightning.loggers.wandb import WandbLogger\r\nfrom pytorch_lightning.profilers import PyTorchProfiler\r\n\r\ntrainer = pl.Trainer(\r\n    devices=\"auto\",\r\n    accelerator=\"auto\",\r\n    strategy=\"auto\",\r\n    precision=\"32-true\",\r\n    enable_checkpointing=True,\r\n    callbacks=EarlyStopping(monitor=\"val-loss\", mode=\"min\"),\r\n    logger=WandbLogger(name=\"textlab-demo\", save_dir=\"logs\/wandb\"),\r\n    profiler=PyTorchProfiler(dirpath=\"logs\/torch_profiler\"),\r\n)","syntax":"python","collapse":true},{"shortcode_title":"snippet_17","code":"# instantiate the datamodule\r\ndatamodule = LabDataModule()\r\n# instantiate the model\r\nmodel = LabModule()\r\n# call .fit\r\ntrainer.fit(model=model, datamodule=datamodule)","syntax":"python","collapse":true},{"shortcode_title":"snippet_18","code":"lab run dev-run","syntax":"python","collapse":true},{"shortcode_title":"snippet_19","code":"lab run demo-run","syntax":"python","collapse":true},{"shortcode_title":"snippet_20","code":"lab run demo-run --logger wandb","syntax":"python","collapse":true}]},"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v24.5 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>PyTorch Lightning for Dummies - A Tutorial and Overview - Lightning AI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"PyTorch Lightning for Dummies - A Tutorial and Overview - Lightning AI\" \/>\n<meta 
property=\"og:description\" content=\"The code in this tutorial is available on GitHub in the text-lab repo. Clone the repo and follow along! Introduction Training deep learning models at scale is an incredibly interesting and complex task. Reproducibility for projects is key, and reproducible code bases are exactly what we get when we leverage PyTorch Lightning for training and... Read more &raquo;\" \/>\n<meta property=\"og:url\" content=\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/\" \/>\n<meta property=\"og:site_name\" content=\"Lightning AI\" \/>\n<meta property=\"article:published_time\" content=\"2023-10-17T22:37:27+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2023-10-17T22:42:15+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/Screenshot-2023-10-12-at-1.22.40-PM.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1262\" \/>\n\t<meta property=\"og:image:height\" content=\"818\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Lightning.ai\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@LightningAI\" \/>\n<meta name=\"twitter:site\" content=\"@LightningAI\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Lightning.ai\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"11 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/\"},\"author\":{\"name\":\"Lightning.ai\",\"@id\":\"https:\/\/lightning.ai\/pages\/#\/schema\/person\/d53c9386be275d278c59022570c0d859\"},\"headline\":\"PyTorch Lightning for Dummies &#8211; A Tutorial and Overview\",\"datePublished\":\"2023-10-17T22:37:27+00:00\",\"dateModified\":\"2023-10-17T22:42:15+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/\"},\"wordCount\":2113,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/lightning.ai\/pages\/#organization\"},\"image\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/Screenshot-2023-10-12-at-1.22.40-PM.png\",\"articleSection\":[\"Articles\",\"Blog\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/\",\"url\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/\",\"name\":\"PyTorch Lightning for Dummies - A Tutorial and Overview - Lightning 
AI\",\"isPartOf\":{\"@id\":\"https:\/\/lightning.ai\/pages\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/Screenshot-2023-10-12-at-1.22.40-PM.png\",\"datePublished\":\"2023-10-17T22:37:27+00:00\",\"dateModified\":\"2023-10-17T22:42:15+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/#primaryimage\",\"url\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/Screenshot-2023-10-12-at-1.22.40-PM.png\",\"contentUrl\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/10\/Screenshot-2023-10-12-at-1.22.40-PM.png\",\"width\":1262,\"height\":818},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/pl-tutorial-and-overview\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/lightning.ai\/pages\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"PyTorch Lightning for Dummies &#8211; A Tutorial and Overview\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/lightning.ai\/pages\/#website\",\"url\":\"https:\/\/lightning.ai\/pages\/\",\"name\":\"Lightning AI\",\"description\":\"The platform for teams to build 
AI.\",\"publisher\":{\"@id\":\"https:\/\/lightning.ai\/pages\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/lightning.ai\/pages\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/lightning.ai\/pages\/#organization\",\"name\":\"Lightning AI\",\"url\":\"https:\/\/lightning.ai\/pages\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/lightning.ai\/pages\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/02\/image-17.png\",\"contentUrl\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/02\/image-17.png\",\"width\":1744,\"height\":856,\"caption\":\"Lightning AI\"},\"image\":{\"@id\":\"https:\/\/lightning.ai\/pages\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/x.com\/LightningAI\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/lightning.ai\/pages\/#\/schema\/person\/d53c9386be275d278c59022570c0d859\",\"name\":\"Lightning.ai\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/lightning.ai\/pages\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/b75fef9be69cb600f385dfba5525cf77?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/b75fef9be69cb600f385dfba5525cf77?s=96&d=mm&r=g\",\"caption\":\"Lightning.ai\"},\"url\":\"https:\/\/lightning.ai\/pages\/author\/lightning-ai\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->"}