Highlights
- One of the most significant issues with PyTorch is that you have to manually write long training loops, which are primarily boilerplate code.
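The manual loop the highlight refers to typically looks something like the sketch below. This is not the article's exact code; the model, data, and hyperparameters are placeholders chosen purely to illustrate the boilerplate (zeroing gradients, backprop, optimizer step) that every plain-PyTorch loop repeats.

```python
import torch
from torch import nn

# Illustrative model, optimizer, and dummy data
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
dataloader = [(torch.randn(8, 10), torch.randint(0, 2, (8,))) for _ in range(5)]

for epoch in range(3):
    model.train()
    for x, y in dataloader:
        optimizer.zero_grad()        # reset accumulated gradients
        loss = loss_fn(model(x), y)  # forward pass + loss
        loss.backward()              # backpropagate
        optimizer.step()             # update weights
```

In practice this loop also accumulates device transfers, metric logging, checkpointing, and validation passes, which is where most of the boilerplate comes from.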
- You can think of PyTorch Lightning as a lightweight wrapper around PyTorch.
- Just as Keras is a wrapper around TensorFlow, PyTorch Lightning is a wrapper around PyTorch, but one that makes training a model far more efficient than writing the loop by hand.
- PyTorch Lightning:
• Abstracts away the boilerplate code, which we typically write with PyTorch
• Provides elegant, one-line support for mixed precision training.
• Works seamlessly in a distributed setting, again, with just a few lines of code.
• Comes with built-in logging and profiling capabilities, and much more.