
Releases: Lightning-AI/pytorch-lightning

New features

12 Aug 20:11
  • validation_step and val_dataloader are now optional.
  • Enabled multiple dataloaders for validation (see the sketch after this list).
  • Support for the latest test-tube logger, optimized for PyTorch 1.2.0.
  • lr_scheduler is now stepped after each epoch.
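To illustrate the multi-dataloader feature, here is a minimal sketch. It assumes the mechanism is returning a list of DataLoaders from val_dataloader and uses the @ptl.data_loader decorator shown further down this page; the model class and datasets (MyModel, self.val_set_a, self.val_set_b) are placeholders:

import pytorch_lightning as ptl
from torch.utils.data import DataLoader


class MyModel(ptl.LightningModule):
    ...

    @ptl.data_loader
    def val_dataloader(self):
        # Return a list of loaders to validate over several datasets;
        # returning a single DataLoader still works as before.
        return [DataLoader(self.val_set_a, batch_size=32),
                DataLoader(self.val_set_b, batch_size=32)]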

Stable fully-featured release

08 Aug 16:39

0.4.0

0.4.0 is the first public release after a short period of testing with public users. Thanks for all the help ironing out bugs to get Lightning running on everything from notebooks to local machines to servers.

This release includes:

  • Extensively tested code.
  • A cleaner API that accommodates the various research use cases.

New features

  • No need for an experiment object in the trainer.
  • Training continuation (not just weights, but also epoch, global step, etc.).
    • If the folder used by the checkpoint callback contains weights, the last checkpoint is loaded automatically.
  • training_step and validation_step no longer reduce outputs automatically. This fixes issues with reducing generated outputs such as images or text (see the sketch after this list).
  • 16-bit precision can now be used with a single GPU (no DP or DDP in this case). This bypasses a compatibility issue between NVIDIA Apex and PyTorch for DP + 16-bit training.
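To illustrate the reduction change, a minimal sketch of reducing validation outputs by hand; the hook names follow this era's API, while the model class, loss, and dictionary keys are placeholders:

import torch
import torch.nn.functional as F
import pytorch_lightning as ptl


class MyModel(ptl.LightningModule):
    ...

    def validation_step(self, batch, batch_idx):
        x, y = batch
        y_hat = self.forward(x)
        # Each step returns its raw outputs; nothing is reduced for you anymore.
        return {'val_loss': F.cross_entropy(y_hat, y), 'generated': y_hat}

    def validation_end(self, outputs):
        # Reduce manually: average the scalar losses, leave generated outputs untouched.
        avg_loss = torch.stack([o['val_loss'] for o in outputs]).mean()
        return {'avg_val_loss': avg_loss}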

Simple data loader

25 Jul 17:28

Simplified data loader.

Added a decorator to do lazy loading internally:

Old:

@property
def tng_dataloader(self):
    if self._tng_dataloader is None:
        self._tng_dataloader = DataLoader(...)
    return self._tng_dataloader

Now:

@ptl.data_loader
def tng_dataloader(self):
    return DataLoader(...)
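For context, this is roughly what such a lazy-loading decorator can look like internally; a simplified sketch, not the library's actual implementation:

def data_loader(fn):
    """Build the DataLoader on first call and cache it on the instance (sketch)."""
    attr_name = '_lazy_' + fn.__name__

    def _lazy_fn(self):
        if not hasattr(self, attr_name):
            setattr(self, attr_name, fn(self))
        return getattr(self, attr_name)

    return _lazy_fn

With this, tng_dataloader is built once on first use and reused afterwards, matching the old hand-written property without the boilerplate.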

Tests!

25 Jul 02:09

Fully tested!

Includes:

  • Code coverage (99%).
  • Full tests that run multiple models in different configurations.
  • Full tests that exercise specific functionality in the trainer.