⚡ Lightning Thunder documentation (0.2.5.dev0)

thunder.torch

A PyTorch dialect in thunder.

torchsymbol(*torchfns[, is_method, ...])

size(a, /[, dim])
    Return type: int | Sequence[int]

to_float(a)
    Return type: Number | TensorProxy

_register_custom_op(custom_op)
    Register a custom_op()'ed function with Thunder.

Under Construction

Operators

Unary

Binary
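Binary operators are elementwise ops on two tensors, and the shape of their result follows the standard PyTorch broadcasting rule: shapes are aligned from the right, and each pair of dimensions must be equal or one of them must be 1. A pure-Python sketch of that shape rule (illustrative only, not Thunder code):

```python
def broadcast_shape(*shapes):
    """Compute the broadcast result shape per the PyTorch/NumPy rule."""
    ndim = max(len(s) for s in shapes)
    result = []
    for i in range(ndim):
        dim = 1
        for s in shapes:
            # Align from the right; missing leading dimensions count as 1.
            j = len(s) - ndim + i
            d = s[j] if j >= 0 else 1
            if d != 1 and dim != 1 and d != dim:
                raise ValueError(f"shapes {shapes} are not broadcastable")
            dim = max(dim, d)
        result.append(dim)
    return tuple(result)

# e.g. broadcast_shape((3, 1), (1, 4)) gives (3, 4)
```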

Conditional

Tensor Creation

Shape Operation

