{"id":5648375,"date":"2023-07-13T08:00:44","date_gmt":"2023-07-13T12:00:44","guid":{"rendered":"https:\/\/lightning.ai\/pages\/?p=5648375"},"modified":"2023-07-24T12:31:51","modified_gmt":"2023-07-24T16:31:51","slug":"tabular-classification-with-lightning","status":"publish","type":"post","link":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/","title":{"rendered":"Tabular Classification with Lightning"},"content":{"rendered":"<div class=\"takeaways card-glow p-4 my-4\"><h3 class=\"w-100 d-block\">Takeaways<\/h3>Learn how to gain back research time by leveraging PyTorch Lightning for over 100 inbuilt methods, hooks, and flags that save you engineering hours on heavy lifts like distributed training in multi-GPU and multi-node environments. <\/div>\n<h2>What is Lightning?<\/h2>\n<p><a href=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/logo-with-text.dbb8ae5cb40e12dfa5b61ce3e0092fba.svg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-5648376\" role=\"img\" src=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/logo-with-text.dbb8ae5cb40e12dfa5b61ce3e0092fba.svg\" alt=\"\" width=\"400\" height=\"133\" \/><\/a><\/p>\n<p>The framework known as <a href=\"https:\/\/github.com\/Lightning-AI\/lightning\" target=\"_blank\" rel=\"noopener\">Lightning<\/a> is <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/latest\/\" target=\"_blank\" rel=\"noopener\">PyTorch Lightning<\/a>. Or perhaps it is better to simply say that \u2013 Lightning <em>contains<\/em> PyTorch Lightning. This changed in mid-2022 when PyTorch Lightning was unified with <a href=\"https:\/\/lightning.ai\/docs\/app\/stable\/\" target=\"_blank\" rel=\"noopener\">Lightning Apps<\/a> under a single framework and rebranded as Lightning. 
As of early 2023, the Lightning repository also includes <a href=\"https:\/\/lightning.ai\/docs\/fabric\/stable\/\" target=\"_blank\" rel=\"noopener\">Lightning Fabric<\/a> \u2013 a lightweight way to scale PyTorch models while returning control over the training loop back to the engineer.<\/p>\n<p>Neither PyTorch Lightning nor Lightning Fabric is meant to replace PyTorch: we still implement our algorithms in PyTorch, while PyTorch Lightning and Fabric manage the training process of those PyTorch implementations. Code examples shared later in this post will clarify what this relationship between PyTorch and Lightning looks like.<\/p>\n<h3>Building with Lightning<\/h3>\n<p>Now that Lightning AI\u2019s frameworks are unified into a single repository and framework known as Lightning, installing Lightning installs each of PyTorch Lightning, Lightning Apps, and Lightning Fabric, along with our metrics library \u2013\u00a0<a href=\"https:\/\/torchmetrics.readthedocs.io\/en\/stable\/\" target=\"_blank\" rel=\"noopener\">TorchMetrics<\/a>.<\/p>\n<blockquote><div class=\"perfect-pullquote vcard pullquote-align-full pullquote-border-placement-left\"><blockquote><p>Researchers, ML Engineers, and Data Scientists also have the option to install each framework individually with one of: <code style=\"font-size: 16px; background-color: #e4e6eb;\">pip install pytorch-lightning<\/code>, <code style=\"font-size: 16px; background-color: #e4e6eb;\">pip install lightning-apps<\/code>, or <code style=\"font-size: 16px; background-color: #e4e6eb;\">pip install lightning-fabric<\/code><\/p><\/blockquote><\/div><\/blockquote>\n<p>The key takeaway here is that Lightning is still the trustworthy PyTorch Lightning framework that allows you to easily scale your models without having to write your own distributed training code or your own training and evaluation loops \u2013 winning you back precious time that can be 
allocated back to research.<\/p>\n<h2>Why Lightning<\/h2>\n<p>Lightning AI saves you time by devoting engineering hours to maintaining Lightning and tackling the tasks you shouldn\u2019t have to handle as a domain researcher, like distributed training on CUDA devices. PyTorch Lightning and Lightning Fabric each enable researchers to focus on the research aspect of their codebase instead of implementing custom <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/common\/trainer.html#trainer-class-api\" target=\"_blank\" rel=\"noopener\">methods and properties<\/a>, <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/api_references.html#callbacks\" target=\"_blank\" rel=\"noopener\">callbacks<\/a>, or <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/extensions\/plugins.html#plugins\" target=\"_blank\" rel=\"noopener\">plugins<\/a> on their own.<\/p>\n<p>How does this compare to NumPy and Pandas? Using PyTorch instead of some combination of NumPy, Pandas, and the Python standard library means that you won\u2019t have to write your own <a href=\"https:\/\/pytorch.org\/blog\/overview-of-pytorch-autograd-engine\/\" target=\"_blank\" rel=\"noopener\">autograd engine<\/a>. 
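To make the autograd point concrete, here is a tiny, generic illustration (not code from the example repo) of what PyTorch's autograd engine handles for you:

```python
import torch

# The gradient of y = x ** 2 is computed by backward(), with no
# hand-written differentiation code.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2
y.backward()

print(x.grad)  # dy/dx = 2x -> tensor(6.)
```

Writing the equivalent by hand with NumPy would mean deriving and coding the gradient of every operation yourself.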
Using PyTorch Lightning or Lightning Fabric with PyTorch means you won\u2019t have to write your own distributed training code with Python and possibly <a href=\"https:\/\/docs.nvidia.com\/cuda\/cuda-c-programming-guide\/index.html\" target=\"_blank\" rel=\"noopener\">CUDA C++<\/a>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-5648380 size-large\" src=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Add-a-heading-1-1024x576.png\" alt=\"\" width=\"1024\" height=\"576\" srcset=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Add-a-heading-1-1024x576.png 1024w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Add-a-heading-1-300x169.png 300w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Add-a-heading-1-1536x864.png 1536w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Add-a-heading-1.png 1600w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Add-a-heading-1-300x169@2x.png 600w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/p>\n<h2>Using Lightning with PyTorch<\/h2>\n<p>What does \u201cUsing Lightning with PyTorch\u201d mean exactly? It means that researchers and engineers can focus on writing their model as <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/common\/lightning_module.html#child-modules\" target=\"_blank\" rel=\"noopener\">Child Modules<\/a> in PyTorch before writing a custom <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/latest\/common\/lightning_module.html\" target=\"_blank\" rel=\"noopener\">LightningModule<\/a> or <a href=\"https:\/\/lightning.ai\/docs\/fabric\/stable\/fundamentals\/convert.html\" target=\"_blank\" rel=\"noopener\">adding Lightning Fabric<\/a> into a vanilla PyTorch training loop. A Child Module is the algorithm that has been implemented in PyTorch and will be trained with PyTorch Lightning or Lightning Fabric. 
A high-level example of this concept is shown below in the code block.<\/p>\n<pre class=\"code-shortcode dark-theme window- collapse-false \" style=\"--height:falsepx\"><code class=\"language-python\">\n\n<pre>import lightning as L\r\nimport torch\r\nfrom torch import nn\r\n\r\n\r\nclass MyCustomTorchModule(nn.Module):\r\n    def __init__(self, in_features: int, num_classes: int):\r\n        super().__init__()\r\n        linear1 = nn.Linear(\r\n            in_features=in_features,\r\n            out_features=in_features,\r\n        )\r\n        relu = nn.ReLU()\r\n        linear2 = nn.Linear(\r\n            in_features=in_features,\r\n            out_features=num_classes,\r\n        )\r\n        self.sequential = nn.Sequential(linear1, relu, linear2)\r\n\r\n    def forward(self, x):\r\n        return self.sequential(x)\r\n\r\n\r\nclass MyCustomLightningModule(L.LightningModule):\r\n    def __init__(self, in_features: int, num_classes: int):\r\n        super().__init__()\r\n        self.model = MyCustomTorchModule(in_features=in_features, num_classes=num_classes)\r\n\r\n    def forward(self, x: torch.Tensor):\r\n        return self.model(x)\r\n<\/pre>\n<\/code><div class=\"copy-button\"><button class=\"expand-button\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>In the example repo \u2013 <a href=\"https:\/\/github.com\/JustinGoheen\/lightning-quant\/tree\/main\" target=\"_blank\" rel=\"noopener\">Lightning Quant<\/a>, an <a href=\"https:\/\/lightning.ai\/courses\/deep-learning-fundamentals\/overview-organizing-your-code-with-pytorch-lightning\/5-2-training-a-multilayer-perceptron-using-the-lightning-trainer\/\" target=\"_blank\" rel=\"noopener\">MLP<\/a> is provided as a Child Module in <a href=\"https:\/\/github.com\/JustinGoheen\/lightning-quant\/blob\/main\/src\/lightning_quant\/models\/mlp.py\" target=\"_blank\" rel=\"noopener\">lightning_quant.models.mlp<\/a>. 
Lightning Quant is a Deep Learning library for training algorithmic trading agents with PyTorch Lightning and Lightning Fabric. Data provided in the repo allows for the reproducibility of the examples shown in this post.<\/p>\n<p>Lightning Quant\u2019s MLP is also shown below \u2013 we\u2019ll stick with MLP throughout this post as the architecture is easy to understand and does not require much explanation. Note that MLP <a href=\"https:\/\/docs.python.org\/3\/tutorial\/classes.html#inheritance\" target=\"_blank\" rel=\"noopener\">inherits from<\/a> PyTorch\u2019s <a href=\"https:\/\/pytorch.org\/docs\/stable\/generated\/torch.nn.Module.html\" target=\"_blank\" rel=\"noopener\">nn.Module<\/a>, making MLP a subclass of <code style=\"font-size: 16px; background-color: #e4e6eb;\">nn.Module<\/code> and allowing MLP to make use of methods available to <code style=\"font-size: 16px; background-color: #e4e6eb;\">nn.Module<\/code> through this inheritance.<\/p>\n<p>ElasticNet Pseudocode<br \/>\n<pre class=\"code-shortcode dark-theme window- collapse-false \" style=\"--height:falsepx\"><code class=\"language-python\">\n\n<pre>class LinearModel(nn.Module):\r\n    def __init__(self, in_features: int, num_classes: int):\r\n        super().__init__()\r\n        self.linear = nn.Linear(\r\n            in_features=in_features,\r\n            out_features=num_classes,\r\n        )\r\n\r\n    def forward(self, x):\r\n        return self.linear(x)\r\n\r\n\r\nclass ElasticNet(L.LightningModule):\r\n    def __init__(self, in_features: int, num_classes: int):\r\n        super().__init__()\r\n        self.model = LinearModel(in_features=in_features, num_classes=num_classes)\r\n\r\n    def forward(self, x: torch.Tensor):\r\n        return self.model(x)\r\n<\/pre>\n<\/code><div class=\"copy-button\"><button class=\"expand-button\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre><\/p>\n<p>ElasticNet Code<br \/>\n<pre class=\"code-shortcode dark-theme window- 
collapse-300 \" style=\"--height:300px\"><code class=\"language-python\">\n\n<pre>import lightning as L\r\nimport torch\r\nimport torch.nn.functional as F\r\nfrom torch import nn, optim\r\nfrom torchmetrics.functional import accuracy\r\nfrom lightning_quant.core.metrics import regularization\r\n\r\n\r\nclass ElasticNet(L.LightningModule):\r\n    \"\"\"Logistic Regression with L1 and L2 Regularization\"\"\"\r\n\r\n    def __init__(\r\n        self,\r\n        in_features: int,\r\n        num_classes: int,\r\n        bias: bool = False,\r\n        lr: float = 0.001,\r\n        l1_strength: float = 0.1,\r\n        l2_strength: float = 0.1,\r\n        optimizer=\"Adam\",\r\n        accuracy_task: str = \"multiclass\",\r\n        dtype=\"float32\",\r\n    ):\r\n        super().__init__()\r\n\r\n        if \"32\" in dtype and torch.cuda.is_available():\r\n            torch.set_float32_matmul_precision(\"medium\")\r\n\r\n        self.lr = lr\r\n        self.l1_strength = l1_strength\r\n        self.l2_strength = l2_strength\r\n        self.accuracy_task = accuracy_task\r\n        self.num_classes = num_classes\r\n        self._dtype = getattr(torch, dtype)\r\n        self.optimizer = getattr(optim, optimizer)\r\n        self.model = nn.Linear(\r\n            in_features=in_features,\r\n            out_features=num_classes,\r\n            bias=bias,\r\n            dtype=self._dtype,\r\n        )\r\n        self.save_hyperparameters()\r\n\r\n    def forward(self, x: torch.Tensor):\r\n        return self.model(x.to(self._dtype))\r\n\r\n    def training_step(self, batch):\r\n        return self.common_step(batch, \"training\")\r\n\r\n    def test_step(self, batch, *args):\r\n        self.common_step(batch, \"test\")\r\n\r\n    def validation_step(self, batch, *args):\r\n        self.common_step(batch, \"val\")\r\n\r\n    def common_step(self, batch, stage):\r\n        \"\"\"consolidates common code for train, test, and validation steps\"\"\"\r\n        x, y = batch\r\n        
x = x.to(self._dtype)\r\n        y = y.to(torch.long)\r\n        y_hat = self.model(x)\r\n        criterion = F.cross_entropy(y_hat, y)\r\n        loss = regularization(\r\n            self.model,\r\n            criterion,\r\n            self.l1_strength,\r\n            self.l2_strength,\r\n        )\r\n\r\n        if stage == \"training\":\r\n            self.log(f\"{stage}_loss\", loss)\r\n            return loss\r\n        if stage in [\"val\", \"test\"]:\r\n            acc = accuracy(\r\n                y_hat.argmax(dim=-1),\r\n                y,\r\n                task=self.accuracy_task,\r\n                num_classes=self.num_classes,\r\n            )\r\n            self.log(f\"{stage}_acc\", acc)\r\n            self.log(f\"{stage}_loss\", loss)\r\n\r\n    def predict_step(self, batch, batch_idx, dataloader_idx=0):\r\n        x, y = batch\r\n        y_hat = self(x)\r\n        y_hat = y_hat.argmax(dim=-1)\r\n        return y_hat\r\n\r\n    def configure_optimizers(self):\r\n        \"\"\"configures the ``torch.optim`` used in training loop\"\"\"\r\n        optimizer = self.optimizer(\r\n            self.parameters(),\r\n            lr=self.lr,\r\n        )\r\n        return optimizer\r\n<\/pre>\n<\/code><div class=\"copy-button\"><button class=\"expand-button\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre><\/p>\n<p>After defining the Child Module as <code style=\"font-size: 16px; background-color: #e4e6eb;\">MLP<\/code>, we need to implement either a LightningModule or a custom PyTorch training loop with Lightning Fabric. 
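The `regularization` helper imported from `lightning_quant.core.metrics` is not reproduced in this post. For intuition, an elastic-net style penalty can be sketched as below; this is a hypothetical stand-in, and the repo's actual implementation may differ:

```python
import torch
from torch import nn


def regularization(model: nn.Module, loss: torch.Tensor,
                   l1_strength: float, l2_strength: float) -> torch.Tensor:
    """Return the base loss plus L1 and L2 penalties on the model's parameters.

    Hypothetical sketch of the helper imported from
    lightning_quant.core.metrics -- not the repo's actual code.
    """
    # Flatten all trainable parameters into a single vector.
    params = torch.cat([p.view(-1) for p in model.parameters()])
    l1 = l1_strength * params.abs().sum()
    l2 = l2_strength * params.pow(2).sum()
    return loss + l1 + l2
```

With both strengths nonzero, the linear model behaves like an elastic net; zeroing one strength recovers plain lasso (L1) or ridge (L2) regularization.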
<code style=\"font-size: 16px; background-color: #e4e6eb;\">ElasticNetMLP<\/code>, the accompanying LightningModule for MLP is shown below, and you will notice <code style=\"font-size: 16px; background-color: #e4e6eb;\">training_step<\/code>, <code style=\"font-size: 16px; background-color: #e4e6eb;\">test_step<\/code>, <code style=\"font-size: 16px; background-color: #e4e6eb;\">validation_step<\/code>, <code style=\"font-size: 16px; background-color: #e4e6eb;\">predict_step<\/code>, and <code style=\"font-size: 16px; background-color: #e4e6eb;\">configure_optimizer<\/code> . These methods are all examples of ways Lightning has extended <code style=\"font-size: 16px; background-color: #e4e6eb;\">nn.Module<\/code> to provide researchers with easy to implement training and evaluation loops in Lightning.Trainer.<\/p>\n<p>Aside from these 5 methods, LightningModule provides researchers with 16 <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/common\/lightning_module.html#lightningmodule-api\" target=\"_blank\" rel=\"noopener\">additional methods<\/a> and <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/api\/lightning.pytorch.core.hooks.ModelHooks.html\" target=\"_blank\" rel=\"noopener\">36 model hooks<\/a>. 
Lightning Fabric, being more lightweight, provides researchers with <a href=\"https:\/\/lightning.ai\/docs\/fabric\/stable\/api\/fabric_args.html\" target=\"_blank\" rel=\"noopener\">8 arguments<\/a> to configure a model\u2019s settings like precision and the distributed training strategy and <a href=\"https:\/\/lightning.ai\/docs\/fabric\/stable\/api\/fabric_methods.html\" target=\"_blank\" rel=\"noopener\">17 methods<\/a> to assist in the training process.<\/p>\n<p>In the example below, we will notice that <code style=\"font-size: 16px; background-color: #e4e6eb;\">ElasticNetMLP<\/code> <a href=\"https:\/\/docs.python.org\/3\/tutorial\/classes.html#inheritance\" target=\"_blank\" rel=\"noopener\">inherits from<\/a> LightningModule \u2013\u00a0meaning that <code style=\"font-size: 16px; background-color: #e4e6eb;\">ElasticNetMLP<\/code> is now a subclass of LightningModule and can be used in conjunction with Lightning.Trainer to train the MLP. In order to train MLP, we first import it into the Python module, and then assign it as <code style=\"font-size: 16px; background-color: #e4e6eb;\">self.model<\/code> in <code style=\"font-size: 16px; background-color: #e4e6eb;\">ElasticNetMLP<\/code> . 
Doing so makes MLP accessible in the other class methods &#8211; notably the <code style=\"font-size: 16px; background-color: #e4e6eb;\">training_step<\/code>, <code style=\"font-size: 16px; background-color: #e4e6eb;\">test_step<\/code>, <code style=\"font-size: 16px; background-color: #e4e6eb;\">validation_step<\/code>, and <code style=\"font-size: 16px; background-color: #e4e6eb;\">predict_step<\/code> methods.<\/p>\n<p>ElasticNetMLP Pseudocode<\/p>\n<pre class=\"code-shortcode dark-theme window- collapse-false \" style=\"--height:falsepx\"><code class=\"language-python\">\n\n<pre>class MLP(nn.Module):\r\n    def __init__(self, in_features: int, num_classes: int):\r\n        super().__init__()\r\n        linear1 = nn.Linear(\r\n            in_features=in_features,\r\n            out_features=in_features,\r\n        )\r\n        relu = nn.ReLU()\r\n        linear2 = nn.Linear(\r\n            in_features=in_features,\r\n            out_features=num_classes,\r\n        )\r\n        self.sequential = nn.Sequential(linear1, relu, linear2)\r\n\r\n    def forward(self, x):\r\n        return self.sequential(x)\r\n\r\n\r\nclass ElasticNetMLP(L.LightningModule):\r\n    def __init__(self, in_features: int, num_classes: int):\r\n        super().__init__()\r\n        self.model = MLP(in_features=in_features, num_classes=num_classes)\r\n\r\n    def forward(self, x: torch.Tensor):\r\n        return self.model(x)\r\n<\/pre>\n<\/code><div class=\"copy-button\"><button class=\"expand-button\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<details>\n<summary>ElasticNetMLP Code<\/summary>\n<pre class=\"code-shortcode dark-theme window- collapse-false \" style=\"--height:falsepx\"><code class=\"language-python\">\n<pre>import lightning as L\r\n\r\nimport torch\r\nimport torch.nn.functional as F\r\nfrom torch import nn, optim\r\nfrom torchmetrics.functional import accuracy\r\nfrom lightning_quant.core.metrics import regularization\r\nfrom lightning_quant.models.mlp import MLP\r\n\r\n\r\nclass 
ElasticNetMLP(L.LightningModule):\r\n    \"\"\"Logistic Regression with L1 and L2 Regularization\"\"\"\r\n\r\n    def __init__(\r\n        self,\r\n        in_features: int,\r\n        num_classes: int,\r\n        bias: bool = False,\r\n        lr: float = 0.001,\r\n        l1_strength: float = 0.5,\r\n        l2_strength: float = 0.5,\r\n        optimizer=\"Adam\",\r\n        accuracy_task: str = \"multiclass\",\r\n        dtype=\"float32\",\r\n    ):\r\n        super().__init__()\r\n\r\n        self.lr = lr\r\n        self.l1_strength = l1_strength\r\n        self.l2_strength = l2_strength\r\n        self.accuracy_task = accuracy_task\r\n        self.num_classes = num_classes\r\n        self._dtype = getattr(torch, dtype)\r\n        self.optimizer = getattr(optim, optimizer)\r\n        self.model = MLP(\r\n            in_features=in_features,\r\n            num_classes=num_classes,\r\n            bias=bias,\r\n            dtype=self._dtype,\r\n        )\r\n        self.save_hyperparameters()\r\n\r\n    def forward(self, x: torch.Tensor):\r\n        return self.model(x)\r\n\r\n    def training_step(self, batch):\r\n        return self.common_step(batch, \"training\")\r\n\r\n    def test_step(self, batch, *args):\r\n        self.common_step(batch, \"test\")\r\n\r\n    def validation_step(self, batch, *args):\r\n        self.common_step(batch, \"val\")\r\n\r\n    def common_step(self, batch, stage):\r\n        x, y = batch\r\n        x = x.to(self._dtype)\r\n        y = y.to(torch.long)\r\n        y_hat = self(x)\r\n        criterion = F.cross_entropy(y_hat, y)\r\n        loss = regularization(\r\n            self.model,\r\n            criterion,\r\n            self.l1_strength,\r\n            self.l2_strength,\r\n        )\r\n\r\n        if stage == \"training\":\r\n            self.log(f\"{stage}_loss\", loss)\r\n            return loss\r\n        if stage in [\"val\", \"test\"]:\r\n            acc = accuracy(\r\n                y_hat.argmax(dim=-1),\r\n           
     y,\r\n                task=self.accuracy_task,\r\n                num_classes=self.num_classes,\r\n            )\r\n            self.log(f\"{stage}_acc\", acc)\r\n            self.log(f\"{stage}_loss\", loss)\r\n\r\n    def predict_step(self, batch, batch_idx, dataloader_idx=0):\r\n        x, y = batch\r\n        y_hat = self(x)\r\n        y_hat = y_hat.argmax(dim=-1)\r\n        return y_hat\r\n\r\n    def configure_optimizers(self):\r\n        optimizer = self.optimizer(\r\n            self.parameters(),\r\n            lr=self.lr,\r\n        )\r\n        return optimizer\r\n<\/pre>\n<\/code><div class=\"copy-button\"><button class=\"expand-button\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre><\/details>\n<p>What makes the example above so user-friendly is that PyTorch Lightning has established conventions for naming the class methods used in the training loop. This means Lightning.Trainer calls each of the above methods at the appropriate time during the training and evaluation loops. So, when we use LightningModule in conjunction with Lightning.Trainer, not only do we avoid writing our own trainer \u2013\u00a0we also don\u2019t have to worry about naming conventions for class methods or class attributes. In this way, Lightning is a North Star for Deep Learning best practices and conventions.<\/p>\n<h2>Training with Lightning.Trainer and Lightning Fabric<\/h2>\n<p>Let\u2019s remember that PyTorch Lightning and Lightning Fabric are not meant to replace PyTorch; rather, they are frameworks created by Lightning AI that enable a better training experience for domain researchers. A notable difference between PyTorch Lightning and Lightning Fabric is how researchers implement a training loop. 
In PyTorch Lightning, we have <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/common\/trainer.html\" target=\"_blank\" rel=\"noopener\">Lightning.Trainer<\/a>, which provides around <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/common\/trainer.html#trainer-flags\" target=\"_blank\" rel=\"noopener\">40 flags<\/a> to assist in automating <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/LTS\/extensions\/loops.html\" target=\"_blank\" rel=\"noopener\">training loops<\/a> \u2013 whereas the lightweight Fabric allows you to <a href=\"https:\/\/github.com\/Lightning-AI\/lightning\/tree\/master\/examples\/fabric\/build_your_own_trainer\" target=\"_blank\" rel=\"noopener\">build your own trainer<\/a>. You may be asking \u2013 why are there two ways to do this in Lightning, and what makes Fabric \u201clightweight\u201d? We\u2019ll cover that below with visual examples.<\/p>\n<h3>Lightning.Trainer<\/h3>\n<p>In the code block below, we have an example of a custom trainer built with Lightning.Trainer \u2013 one of the two Core API classes of PyTorch Lightning. While the code may look simple \u2013 it is only 35 lines \u2013 it abstracts an entire framework that has been under development since 2019 and is the result of contributions from several hundred international contributors.<\/p>\n<p>If you\u2019ve ever written your own training loops from scratch for statistical learning with NumPy, or with PyTorch for Deep Learning, then you\u2019ll notice the immediate convenience of using Lightning.Trainer. If you haven\u2019t, that is okay too \u2013 the ease of use of the Lightning.Trainer will become apparent as we write custom trainers with Lightning Fabric. 
This isn\u2019t to say that Lightning Fabric is difficult to use; instead, we are acknowledging the trade-offs between a managed training loop in Lightning.Trainer and having nearly full control over the loop with Lightning Fabric.<\/p>\n<pre class=\"code-shortcode dark-theme window- collapse-false \" style=\"--height:falsepx\"><code class=\"language-python\">\n\n<pre>import lightning as L\r\nfrom lightning.pytorch import seed_everything\r\nfrom lightning.pytorch.callbacks import ModelCheckpoint\r\nfrom lightning.pytorch.loggers import Logger, TensorBoardLogger\r\nfrom lightning.pytorch.profilers import Profiler\r\n\r\nimport torch\r\nfrom typing import Any, Dict, List, Optional, Union\r\n\r\nclass QuantLightningTrainer(L.Trainer):\r\n    \"\"\"A custom Lightning Trainer\"\"\"\r\n\r\n    def __init__(\r\n        self,\r\n        logger: Optional[Logger] = None,\r\n        profiler: Optional[Profiler] = None,\r\n        callbacks: Optional[List] = [],\r\n        plugins: Optional[List] = [],\r\n        set_seed: bool = True,\r\n        seed: int = 42,\r\n        profiler_logs: Optional[str] = None,\r\n        tensorboard_logs: Optional[str] = None,\r\n        checkpoints_dir: Optional[str] = None,\r\n        **trainer_init_kwargs: Dict[str, Any]\r\n    ) -&gt; None:\r\n        \r\n        if set_seed:\r\n            seed_everything(seed, workers=True)\r\n\r\n        super().__init__(\r\n            logger=logger,\r\n            profiler=profiler,\r\n            callbacks=callbacks,\r\n            plugins=plugins,\r\n            **trainer_init_kwargs\r\n        )\r\n<\/pre>\n<\/code><div class=\"copy-button\"><button class=\"expand-button\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>As for the custom Lightning.Trainer shown above \u2013 it is entirely possible to keep this out of a Python class object and instead run this from a script or in a Jupyter Notebook. 
The implementation as a custom class has more to do with the fact that Lightning Quant provides a CLI app built with Typer, and that CLI app calls <code style=\"font-size: 16px; background-color: #e4e6eb;\">QuantLightningTrainer<\/code> as a command. Providing a CLI in Lightning Quant means that researchers can train agents from the command line by calling the app with <code style=\"font-size: 16px; background-color: #e4e6eb;\">quant run trainer<\/code> or <code style=\"font-size: 16px; background-color: #e4e6eb;\">quant run fabric<\/code> \u2013 we will discuss the CLI App in greater detail after covering Lightning Fabric.<\/p>\n<h3>Lightning Fabric<\/h3>\n<p>Again, Lightning Fabric is the lightweight companion to Lightning.Trainer. The heavy lifting that we have done for you in Lightning Fabric centers on device logic and distributed training. Lightning Fabric requires that we build our own training loop. This can be as simple as the implementation shown below or as complex as this example shown in how to <a href=\"https:\/\/github.com\/Lightning-AI\/lightning\/tree\/master\/examples\/fabric\/build_your_own_trainer\" target=\"_blank\" rel=\"noopener\">build your own trainer<\/a>.<\/p>\n<p>The custom Lightning Fabric trainer used in Lightning Quant will look very similar to training loops written with PyTorch \u2013\u00a0because Fabric provides a drop-in replacement for <a href=\"https:\/\/pytorch.org\/docs\/stable\/generated\/torch.Tensor.backward.html\" target=\"_blank\" rel=\"noopener\">torch.Tensor.backward<\/a> as <a href=\"https:\/\/lightning.ai\/docs\/fabric\/stable\/api\/fabric_methods.html#backward\" target=\"_blank\" rel=\"noopener\">lightning.fabric.backward<\/a>. 
Additionally, if you\u2019ve written training loops in NumPy and implemented that loop as a bespoke trainer class, the PyTorch and Lightning Fabric training loop shown below should be very familiar to you.<\/p>\n<p>Fabric Trainer Pseudocode<br \/>\n<pre class=\"code-shortcode dark-theme window- collapse-false \" style=\"--height:falsepx\"><code class=\"language-python\">\n\n<pre>import lightning as L\r\n\r\nimport torch\r\nfrom lightning_quant.data.dataset import MarketDataset\r\nfrom lightning_quant.models.mlp import MLP\r\n\r\n\r\ndef training_step(model, batch):\r\n    \"\"\"a custom training step that computes and returns the loss\"\"\"\r\n\r\nfabric = L.Fabric(\r\n    accelerator=\"cpu\",\r\n    devices=\"auto\",\r\n    strategy=\"auto\",\r\n    num_nodes=1,\r\n    precision=\"32-true\",\r\n    loggers=None,\r\n)\r\n\r\nfabric.launch()\r\n\r\nmodel = MLP()\r\noptimizer = torch.optim.Adam(model.parameters(), lr=0.1)\r\nmodel, optimizer = fabric.setup(model, optimizer)\r\n\r\ndataset = MarketDataset()\r\ndataloader = torch.utils.data.DataLoader(dataset)\r\ndataloader = fabric.setup_dataloaders(dataloader)\r\n\r\nmodel.train()\r\nfor epoch in range(20):\r\n    for batch in dataloader:\r\n        optimizer.zero_grad()\r\n        loss = training_step(model, batch)\r\n        fabric.log(\"loss\", loss)\r\n        fabric.backward(loss)\r\n        optimizer.step()\r\n<\/pre>\n<\/code><div class=\"copy-button\"><button class=\"expand-button\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre><\/p>\n<details>\n<summary>Fabric Trainer Code<\/summary>\n<pre class=\"code-shortcode dark-theme window- collapse-false \" style=\"--height:falsepx\"><code class=\"language-python\">\n<pre>import lightning as L\r\nfrom lightning.fabric.loggers import TensorBoardLogger\r\n\r\nimport torch\r\nimport torch.nn.functional as F\r\nfrom lightning_quant.core.metrics import regularization\r\n\r\n\r\nclass QuantFabricTrainer:\r\n    def __init__(\r\n        self,\r\n        accelerator=\"cpu\",\r\n        devices=\"auto\",\r\n        
strategy=\"auto\",\r\n        num_nodes=1,\r\n        max_epochs=20,\r\n        precision=\"32-true\",\r\n        dtype=\"float32\",\r\n        matmul_precision=\"medium\",\r\n    ) -&gt; None:\r\n        \"\"\"A custom, minimal Lightning Fabric Trainer\"\"\"\r\n\r\n        if \"32\" in dtype and torch.cuda.is_available():\r\n            torch.set_float32_matmul_precision(matmul_precision)\r\n\r\n        self.fabric = L.Fabric(\r\n            accelerator=accelerator,\r\n            devices=devices,\r\n            strategy=strategy,\r\n            num_nodes=num_nodes,\r\n            precision=precision,\r\n            loggers=TensorBoardLogger(root_dir=\"logs\"),\r\n        )\r\n        self.fabric.launch()\r\n\r\n        self._dtype = getattr(torch, dtype)\r\n        self.max_epochs = max_epochs\r\n        self.loss = None\r\n        self.dataset = None\r\n        self.model = None\r\n\r\n    def fit(\r\n        self,\r\n        model,\r\n        dataset,\r\n        l1_strength: float = 0.1,\r\n        l2_strength: float = 0.1,\r\n    ) -&gt; None:\r\n        self.dataset = dataset\r\n        self.dataloader = torch.utils.data.DataLoader(self.dataset)\r\n\r\n        self.model = model\r\n        self.optimizer = torch.optim.Adam(self.model.parameters(), lr=0.1)\r\n        self.model, self.optimizer = self.fabric.setup(self.model, self.optimizer)\r\n        self.dataloader = self.fabric.setup_dataloaders(self.dataloader)\r\n\r\n        self.model.train()\r\n        for epoch in range(self.max_epochs):\r\n            for batch in self.dataloader:\r\n                input, target = batch\r\n                input = input.to(self._dtype)\r\n                self.optimizer.zero_grad()\r\n                output = self.model(input)\r\n                criterion = F.cross_entropy(output, target.to(torch.long))\r\n                self.loss = regularization(\r\n                    self.model,\r\n                    criterion,\r\n                    l1_strength=l1_strength,\r\n  
                  l2_strength=l2_strength,\r\n                )\r\n                self.fabric.log(\"loss\", self.loss)\r\n                self.fabric.backward(self.loss)\r\n                self.optimizer.step()\r\n<\/pre>\n<\/code><div class=\"copy-button\"><button class=\"expand-button\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre><\/details>\n<p>The example shown above is simple in that the training loop is bounded only by a for-loop that terminates once <code style=\"font-size: 16px; background-color: #e4e6eb;\">max_epochs<\/code> has been reached. The example does not monitor for early stopping based on a particular metric\u2019s improvement or lack thereof \u2013 it simply trains for as many epochs as you\u2019ve told it to and has no concept of convergence criteria.<\/p>\n<p>What remains to be shown is how we actually go about training <code style=\"font-size: 16px; background-color: #e4e6eb;\">ElasticNetMLP<\/code> or <code style=\"font-size: 16px; background-color: #e4e6eb;\">MLP<\/code> with Lightning.Trainer or Lightning Fabric. We\u2019ll cover the training process along with the CLI app implementation in the next section.<\/p>\n<h2>Enabling Training from the Command Line<\/h2>\n<p>What is a CLI? A CLI is a Command Line Interface \u2013 a terminal app like those offered in <code style=\"font-size: 16px; background-color: #e4e6eb;\">pip<\/code>, <code style=\"font-size: 16px; background-color: #e4e6eb;\">conda<\/code>, <code style=\"font-size: 16px; background-color: #e4e6eb;\">poetry<\/code>, and <code style=\"font-size: 16px; background-color: #e4e6eb;\">git<\/code>; or standard Unix tools like <code style=\"font-size: 16px; background-color: #e4e6eb;\">ls<\/code> and <code style=\"font-size: 16px; background-color: #e4e6eb;\">ps<\/code>. There are several ways to build CLIs with Python. 
One is the Python standard library\u2019s <a href=\"https:\/\/docs.python.org\/3\/library\/argparse.html\" target=\"_blank\" rel=\"noopener\">argparse<\/a>. Another is <a href=\"https:\/\/click.palletsprojects.com\/en\/8.1.x\/\" target=\"_blank\" rel=\"noopener\">Click<\/a> by the Pallets Projects. Yet another is <a href=\"https:\/\/typer.tiangolo.com\/\" target=\"_blank\" rel=\"noopener\">Typer<\/a>, built by the creator of FastAPI. I\u2019ve opted to use Typer in Lightning Quant because it has Click and <a href=\"https:\/\/rich.readthedocs.io\/en\/latest\/\" target=\"_blank\" rel=\"noopener\">Rich<\/a> under the hood \u2013 making for a familiar interface and great-looking formatting in the terminal.<\/p>\n<p>Back to the topic at hand \u2013 training a PyTorch model with PyTorch Lightning or Lightning Fabric. Below we have two implementations, one for each trainer. We will cover the implementation for Lightning.Trainer first.<\/p>\n<div class=\"perfect-pullquote vcard pullquote-align-full pullquote-border-placement-left\"><blockquote><p>The Typer code has been removed from the examples in order to keep the code concise. For the full CLI, please see <a href=\"https:\/\/github.com\/JustinGoheen\/lightning-quant\/blob\/main\/src\/lightning_quant\/cli\/interface.py\" target=\"_blank\" rel=\"noopener\">lightning_quant.cli.interface<\/a><\/p><\/blockquote><\/div>\n<h3>Running Lightning.Trainer<\/h3>\n<p>Recall from above that the custom Lightning.Trainer is named <code style=\"font-size: 16px; background-color: #e4e6eb;\">QuantLightningTrainer<\/code>, and the custom LightningModule is named <code style=\"font-size: 16px; background-color: #e4e6eb;\">ElasticNetMLP<\/code>. You will also notice <a href=\"https:\/\/github.com\/JustinGoheen\/lightning-quant\/blob\/main\/src\/lightning_quant\/data\/datamodule.py\" target=\"_blank\" rel=\"noopener\">MarketDataModule<\/a> in the import statements. 
<code style=\"font-size: 16px; background-color: #e4e6eb;\">MarketDataModule<\/code> is a custom <a href=\"https:\/\/lightning.ai\/docs\/pytorch\/stable\/data\/datamodule.html\" target=\"_blank\" rel=\"noopener\">LightningDataModule<\/a> that provides the PyTorch <a href=\"https:\/\/pytorch.org\/docs\/stable\/data.html#torch.utils.data.DataLoader\" target=\"_blank\" rel=\"noopener\">DataLoaders<\/a> to <code style=\"font-size: 16px; background-color: #e4e6eb;\">QuantLightningTrainer<\/code>.<\/p>\n<pre class=\"code-shortcode dark-theme window- collapse-false \" style=\"--height:falsepx\"><code class=\"language-python\">\n\n<pre>from lightning.pytorch.callbacks import EarlyStopping\r\n\r\nfrom lightning_quant.core.lightning_trainer import QuantLightningTrainer\r\nfrom lightning_quant.data.datamodule import MarketDataModule\r\nfrom lightning_quant.models.mlp import ElasticNetMLP\r\n\r\n# num_classes and the trainer options below are supplied by the CLI\r\nmodel = ElasticNetMLP(in_features=6, num_classes=num_classes)\r\n\r\ndatamodule = MarketDataModule()\r\n\r\ntrainer = QuantLightningTrainer(\r\n    devices=devices or \"auto\",\r\n    accelerator=accelerator,\r\n    strategy=strategy,\r\n    fast_dev_run=fast_dev_run,\r\n    precision=precision,\r\n    max_epochs=max_epochs,\r\n    callbacks=[EarlyStopping(\"training_loss\")],\r\n)\r\n\r\ntrainer.fit(model=model, datamodule=datamodule)\r\n<\/pre>\n<\/code><div class=\"copy-button\"><button class=\"expand-button\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>The above code is executed in Lightning Quant by calling <code style=\"font-size: 16px; background-color: #e4e6eb;\">quant run trainer<\/code> from the command line. The <code style=\"font-size: 16px; background-color: #e4e6eb;\">trainer<\/code> command is configured to receive several options that can be passed to <code style=\"font-size: 16px; background-color: #e4e6eb;\">QuantLightningTrainer<\/code>. 
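<\/p>
<p>The full project wires these options up with Typer, but the option-forwarding pattern itself can be sketched with the standard library\u2019s argparse. The flag names below are hypothetical stand-ins that mirror the <code style=\"font-size: 16px; background-color: #e4e6eb;\">QuantLightningTrainer<\/code> arguments:<\/p>

```python
import argparse

# A minimal sketch of a "quant run trainer"-style command. The flag names
# are hypothetical stand-ins mirroring the QuantLightningTrainer arguments.
parser = argparse.ArgumentParser(prog="quant-trainer")
parser.add_argument("--accelerator", default="auto")
parser.add_argument("--devices", default="auto")
parser.add_argument("--strategy", default="auto")
parser.add_argument("--max-epochs", type=int, default=20)
parser.add_argument("--precision", default="32-true")
parser.add_argument("--fast-dev-run", action="store_true")

# The parsed options become keyword arguments for the trainer.
args = parser.parse_args(["--accelerator", "cuda", "--max-epochs", "5"])
trainer_kwargs = vars(args)
print(trainer_kwargs["accelerator"], trainer_kwargs["max_epochs"])
```

<p>Typer builds the equivalent interface from a plain function signature, which is why the Typer code could be stripped from these examples without changing the trainer calls themselves.<\/p>
<p>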
Those additional options are pictured below.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-5648377 size-large\" src=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.07.03-PM-1024x507.png\" alt=\"\" width=\"1024\" height=\"507\" srcset=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.07.03-PM-1024x507.png 1024w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.07.03-PM-300x149.png 300w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.07.03-PM-1536x761.png 1536w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.07.03-PM.png 1704w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.07.03-PM-300x149@2x.png 600w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/p>\n<h3>Running the Lightning Fabric Trainer<\/h3>\n<p>Using the <code style=\"font-size: 16px; background-color: #e4e6eb;\">quant<\/code> CLI to run the Lightning Fabric trainer is accomplished in the same manner as running the Lightning.Trainer. 
To do so, simply call <code style=\"font-size: 16px; background-color: #e4e6eb;\">quant run fabric<\/code> from the command line to run the code shown below.<\/p>\n<pre class=\"code-shortcode dark-theme window- collapse-false \" style=\"--height:falsepx\"><code class=\"language-python\">\n\n<pre>from lightning_quant.data.dataset import MarketDataset\r\nfrom lightning_quant.core.fabric_trainer import QuantFabricTrainer\r\nfrom lightning_quant.models.mlp import MLP\r\n\r\n# num_classes and the trainer options below are supplied by the CLI\r\nmodel = MLP(in_features=6, num_classes=num_classes)\r\n\r\ndataset = MarketDataset()\r\n\r\ntrainer = QuantFabricTrainer(\r\n    accelerator=accelerator,\r\n    devices=devices,\r\n    strategy=strategy,\r\n    num_nodes=num_nodes,\r\n    max_epochs=max_epochs,\r\n    precision=precision,\r\n)\r\n\r\ntrainer.fit(model, dataset)\r\n<\/pre>\n<\/code><div class=\"copy-button\"><button class=\"expand-button\">Expand<\/button><button class=\"copy\">Copy<\/button><\/div><\/pre>\n<p>In the example above, you may notice that <code style=\"font-size: 16px; background-color: #e4e6eb;\">MarketDataset<\/code> is imported instead of <code style=\"font-size: 16px; background-color: #e4e6eb;\">MarketDataModule<\/code>. This is because <code style=\"font-size: 16px; background-color: #e4e6eb;\">QuantFabricTrainer<\/code> accepts <code style=\"font-size: 16px; background-color: #e4e6eb;\">MarketDataset<\/code> as the dataset argument in the <code style=\"font-size: 16px; background-color: #e4e6eb;\">.fit<\/code> method and then passes it to a PyTorch DataLoader. Just as the <code style=\"font-size: 16px; background-color: #e4e6eb;\">trainer<\/code> command is configured to receive several options that can be passed to <code style=\"font-size: 16px; background-color: #e4e6eb;\">QuantLightningTrainer<\/code>, the <code style=\"font-size: 16px; background-color: #e4e6eb;\">fabric<\/code> command can also accept additional options to pass to <code style=\"font-size: 16px; background-color: #e4e6eb;\">QuantFabricTrainer<\/code>. 
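<\/p>
<p><code style=\"font-size: 16px; background-color: #e4e6eb;\">QuantFabricTrainer<\/code> only needs the dataset to satisfy the map-style protocol that a PyTorch DataLoader consumes \u2013 <code style=\"font-size: 16px; background-color: #e4e6eb;\">__len__<\/code> and <code style=\"font-size: 16px; background-color: #e4e6eb;\">__getitem__<\/code>. A minimal stand-in for MarketDataset (hypothetical toy features and labels, no framework dependency) can be sketched as:<\/p>

```python
class ToyMarketDataset:
    """A hypothetical stand-in for MarketDataset built on plain Python lists.

    It implements the map-style protocol (__len__ and __getitem__) that a
    PyTorch DataLoader expects of its dataset argument.
    """

    def __init__(self, features, labels):
        assert len(features) == len(labels)
        self.features = features
        self.labels = labels

    def __len__(self):
        # a DataLoader uses this to size its sampler
        return len(self.features)

    def __getitem__(self, idx):
        # one (input, target) pair, matching the unpacking in the fit loop
        return self.features[idx], self.labels[idx]


# two rows of six features each, with integer class labels
dataset = ToyMarketDataset([[0.1] * 6, [0.2] * 6], [0, 1])
print(len(dataset), dataset[1])
```

<p>Passing such an object to <code style=\"font-size: 16px; background-color: #e4e6eb;\">trainer.fit<\/code> lets Fabric\u2019s <code style=\"font-size: 16px; background-color: #e4e6eb;\">setup_dataloaders<\/code> wrap the resulting DataLoader, handling device placement and, when needed, distributed sampling.<\/p>
<p>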
The additional options for <code style=\"font-size: 16px; background-color: #e4e6eb;\">fabric<\/code> are pictured below.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone wp-image-5648381 size-large\" src=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM-1024x507.png\" alt=\"\" width=\"1024\" height=\"507\" srcset=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM-1024x507.png 1024w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM-300x149.png 300w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM-1536x761.png 1536w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png 1704w, https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM-300x149@2x.png 600w\" sizes=\"(max-width: 1024px) 100vw, 1024px\" \/><\/p>\n<h3>An Example of Using the CLI to Run a Training Session<\/h3>\n<p>Below is a quick video of using <code style=\"font-size: 16px; background-color: #e4e6eb;\">quant run fabric<\/code> in a CUDA-enabled environment.<\/p>\n<div style=\"width: 1200px;\" class=\"wp-video\"><!--[if lt IE 9]><script>document.createElement('video');<\/script><![endif]-->\n<video class=\"wp-video-shortcode\" id=\"video-5648375-1\" width=\"1200\" height=\"675\" preload=\"metadata\" controls=\"controls\"><source type=\"video\/mp4\" src=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Add-a-heading.mp4?_=1\" \/><a href=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Add-a-heading.mp4\">https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Add-a-heading.mp4<\/a><\/video><\/div>\n<p>The CLI can configure <code style=\"font-size: 16px; background-color: #e4e6eb;\">QuantFabricTrainer<\/code> to run on a CUDA device or devices by 
setting the <code style=\"font-size: 16px; background-color: #e4e6eb;\">--accelerator<\/code> flag to either <code style=\"font-size: 16px; background-color: #e4e6eb;\">cuda<\/code> or <code style=\"font-size: 16px; background-color: #e4e6eb;\">gpu<\/code>; Lightning Fabric will then enable the appropriate backend settings for your combination of <code style=\"font-size: 16px; background-color: #e4e6eb;\">accelerator<\/code> and <code style=\"font-size: 16px; background-color: #e4e6eb;\">precision<\/code>. For example, Lightning Fabric may check for <a href=\"https:\/\/www.nvidia.com\/en-us\/data-center\/ampere-architecture\/\" target=\"_blank\" rel=\"noopener\">NVIDIA Ampere<\/a> Tensor Cores before setting matmul precision. So while Fabric is lightweight, it still handles tasks that support a variety of system configurations without requiring you to write that code yourself.<\/p>\n<h2>Conclusion<\/h2>\n<p>In this article, we learned how Lightning is both PyTorch Lightning and Lightning Fabric, and how these combined frameworks provide researchers with maximal flexibility and minimal boilerplate around raw PyTorch code. You can find the code and data used to create the examples shown above on GitHub at <a href=\"https:\/\/github.com\/JustinGoheen\/lightning-quant\" target=\"_blank\" rel=\"noopener\">Lightning Quant<\/a>.<\/p>\n<h3>Join the Community<\/h3>\n<p>Lightning AI is proud to be open-source and hosts a large community of engineers in our Discord. <a href=\"https:\/\/discord.gg\/jxvyg2zWG4\" target=\"_blank\" rel=\"noopener\">Join us<\/a> and be a part of the open-source effort to drive the world forward with artificial intelligence and Lightning!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>What is Lightning? The framework known as Lightning is PyTorch Lightning. Or perhaps it is better to simply say that \u2013 Lightning contains PyTorch Lightning. 
This changed in mid-2022 when PyTorch Lightning was unified with Lightning Apps under a single framework and rebranded as Lightning. As of early 2023, the Lightning repository also includes Lightning<a class=\"excerpt-read-more\" href=\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/\" title=\"ReadTabular Classification with Lightning\">&#8230; Read more &raquo;<\/a><\/p>\n","protected":false},"author":16,"featured_media":5648381,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"inline_featured_image":false,"footnotes":"","_links_to":"","_links_to_target":""},"categories":[29,41],"tags":[],"glossary":[],"acf":{"mathjax":false,"default_editor":true,"show_table_of_contents":false,"additional_authors":false,"hide_from_archive":false,"content_type":"Blog Post","sticky":false,"custom_styles":".single-blog-post details {\r\n    margin-bottom: 2rem;\r\n}"},"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v24.5 - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Tabular Classification with Lightning - Lightning AI<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Tabular Classification with Lightning - Lightning AI\" \/>\n<meta property=\"og:description\" content=\"What is Lightning? The framework known as Lightning is PyTorch Lightning. Or perhaps it is better to simply say that \u2013 Lightning contains PyTorch Lightning. This changed in mid-2022 when PyTorch Lightning was unified with Lightning Apps under a single framework and rebranded as Lightning. As of early 2023, the Lightning repository also includes Lightning... 
Read more &raquo;\" \/>\n<meta property=\"og:url\" content=\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/\" \/>\n<meta property=\"og:site_name\" content=\"Lightning AI\" \/>\n<meta property=\"article:published_time\" content=\"2023-07-13T12:00:44+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2023-07-24T16:31:51+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1704\" \/>\n\t<meta property=\"og:image:height\" content=\"844\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"JP Hennessy\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:creator\" content=\"@LightningAI\" \/>\n<meta name=\"twitter:site\" content=\"@LightningAI\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"JP Hennessy\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. 
reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"18 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/\"},\"author\":{\"name\":\"JP Hennessy\",\"@id\":\"https:\/\/lightning.ai\/pages\/#\/schema\/person\/2518f4d5541f8e98016f6289169141a6\"},\"headline\":\"Tabular Classification with Lightning\",\"datePublished\":\"2023-07-13T12:00:44+00:00\",\"dateModified\":\"2023-07-24T16:31:51+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/\"},\"wordCount\":2063,\"commentCount\":0,\"publisher\":{\"@id\":\"https:\/\/lightning.ai\/pages\/#organization\"},\"image\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png\",\"articleSection\":[\"Blog\",\"Tutorials\"],\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"CommentAction\",\"name\":\"Comment\",\"target\":[\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#respond\"]}]},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/\",\"url\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/\",\"name\":\"Tabular Classification with Lightning - Lightning 
AI\",\"isPartOf\":{\"@id\":\"https:\/\/lightning.ai\/pages\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png\",\"datePublished\":\"2023-07-13T12:00:44+00:00\",\"dateModified\":\"2023-07-24T16:31:51+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#primaryimage\",\"url\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png\",\"contentUrl\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png\",\"width\":1704,\"height\":844},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/lightning.ai\/pages\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Tabular Classification with Lightning\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/lightning.ai\/pages\/#website\",\"url\":\"https:\/\/lightning.ai\/pages\/\",\"name\":\"Lightning AI\",\"description\":\"The platform for teams to build 
AI.\",\"publisher\":{\"@id\":\"https:\/\/lightning.ai\/pages\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/lightning.ai\/pages\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/lightning.ai\/pages\/#organization\",\"name\":\"Lightning AI\",\"url\":\"https:\/\/lightning.ai\/pages\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/lightning.ai\/pages\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/02\/image-17.png\",\"contentUrl\":\"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/02\/image-17.png\",\"width\":1744,\"height\":856,\"caption\":\"Lightning AI\"},\"image\":{\"@id\":\"https:\/\/lightning.ai\/pages\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/x.com\/LightningAI\"]},{\"@type\":\"Person\",\"@id\":\"https:\/\/lightning.ai\/pages\/#\/schema\/person\/2518f4d5541f8e98016f6289169141a6\",\"name\":\"JP Hennessy\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/lightning.ai\/pages\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/28ade268218ae45f723b0b62499f527a?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/28ade268218ae45f723b0b62499f527a?s=96&d=mm&r=g\",\"caption\":\"JP Hennessy\"},\"url\":\"https:\/\/lightning.ai\/pages\/author\/jplightning-ai\/\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. 
-->","yoast_head_json":{"title":"Tabular Classification with Lightning - Lightning AI","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/","og_locale":"en_US","og_type":"article","og_title":"Tabular Classification with Lightning - Lightning AI","og_description":"What is Lightning? The framework known as Lightning is PyTorch Lightning. Or perhaps it is better to simply say that \u2013 Lightning contains PyTorch Lightning. This changed in mid-2022 when PyTorch Lightning was unified with Lightning Apps under a single framework and rebranded as Lightning. As of early 2023, the Lightning repository also includes Lightning... Read more &raquo;","og_url":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/","og_site_name":"Lightning AI","article_published_time":"2023-07-13T12:00:44+00:00","article_modified_time":"2023-07-24T16:31:51+00:00","og_image":[{"width":1704,"height":844,"url":"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png","type":"image\/png"}],"author":"JP Hennessy","twitter_card":"summary_large_image","twitter_creator":"@LightningAI","twitter_site":"@LightningAI","twitter_misc":{"Written by":"JP Hennessy","Est. 
reading time":"18 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#article","isPartOf":{"@id":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/"},"author":{"name":"JP Hennessy","@id":"https:\/\/lightning.ai\/pages\/#\/schema\/person\/2518f4d5541f8e98016f6289169141a6"},"headline":"Tabular Classification with Lightning","datePublished":"2023-07-13T12:00:44+00:00","dateModified":"2023-07-24T16:31:51+00:00","mainEntityOfPage":{"@id":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/"},"wordCount":2063,"commentCount":0,"publisher":{"@id":"https:\/\/lightning.ai\/pages\/#organization"},"image":{"@id":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#primaryimage"},"thumbnailUrl":"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png","articleSection":["Blog","Tutorials"],"inLanguage":"en-US","potentialAction":[{"@type":"CommentAction","name":"Comment","target":["https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#respond"]}]},{"@type":"WebPage","@id":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/","url":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/","name":"Tabular Classification with Lightning - Lightning 
AI","isPartOf":{"@id":"https:\/\/lightning.ai\/pages\/#website"},"primaryImageOfPage":{"@id":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#primaryimage"},"image":{"@id":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#primaryimage"},"thumbnailUrl":"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png","datePublished":"2023-07-13T12:00:44+00:00","dateModified":"2023-07-24T16:31:51+00:00","breadcrumb":{"@id":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#primaryimage","url":"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png","contentUrl":"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/07\/Screenshot-2023-07-12-at-7.34.12-PM.png","width":1704,"height":844},{"@type":"BreadcrumbList","@id":"https:\/\/lightning.ai\/pages\/blog\/tabular-classification-with-lightning\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/lightning.ai\/pages\/"},{"@type":"ListItem","position":2,"name":"Tabular Classification with Lightning"}]},{"@type":"WebSite","@id":"https:\/\/lightning.ai\/pages\/#website","url":"https:\/\/lightning.ai\/pages\/","name":"Lightning AI","description":"The platform for teams to build 
AI.","publisher":{"@id":"https:\/\/lightning.ai\/pages\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/lightning.ai\/pages\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/lightning.ai\/pages\/#organization","name":"Lightning AI","url":"https:\/\/lightning.ai\/pages\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/lightning.ai\/pages\/#\/schema\/logo\/image\/","url":"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/02\/image-17.png","contentUrl":"https:\/\/lightningaidev.wpengine.com\/wp-content\/uploads\/2023\/02\/image-17.png","width":1744,"height":856,"caption":"Lightning AI"},"image":{"@id":"https:\/\/lightning.ai\/pages\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/x.com\/LightningAI"]},{"@type":"Person","@id":"https:\/\/lightning.ai\/pages\/#\/schema\/person\/2518f4d5541f8e98016f6289169141a6","name":"JP Hennessy","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/lightning.ai\/pages\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/28ade268218ae45f723b0b62499f527a?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/28ade268218ae45f723b0b62499f527a?s=96&d=mm&r=g","caption":"JP 
Hennessy"},"url":"https:\/\/lightning.ai\/pages\/author\/jplightning-ai\/"}]}},"_links":{"self":[{"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/posts\/5648375"}],"collection":[{"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/users\/16"}],"replies":[{"embeddable":true,"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/comments?post=5648375"}],"version-history":[{"count":0,"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/posts\/5648375\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/media\/5648381"}],"wp:attachment":[{"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/media?parent=5648375"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/categories?post=5648375"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/tags?post=5648375"},{"taxonomy":"glossary","embeddable":true,"href":"https:\/\/lightning.ai\/pages\/wp-json\/wp\/v2\/glossary?post=5648375"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}