Highlights
- 2 discussions answered
Popular repositories
- Speech Recognition using the DeepSpeech2 network and the CTC activation function.
- Forked from williamFalcon/minGPT: a minimal PyTorch re-implementation of OpenAI GPT (Generative Pretrained Transformer) training.
1,871 contributions in the last year
Contribution activity
July 2021
Created 14 commits in 4 repositories
Created a pull request in PyTorchLightning/pytorch-lightning that received 11 comments
Add Windows Support for DeepSpeed
What does this PR do? Closes #6651. To test this, I think I'll have to mock a bunch of things and ensure initialization works on Windows. Fixes #<i…
+66 −2 · 11 comments
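The testing strategy mentioned in the PR description (mocking things so initialization can be verified on Windows from any host OS) could be sketched roughly like this. The `pick_backend` helper and the gloo/NCCL fallback are illustrative assumptions for the sketch, not the PR's actual code:

```python
from unittest import mock

# Hypothetical sketch (not the PR's real test code): patch platform
# detection so a Windows-only code path can be exercised on any OS.
def pick_backend():
    import platform
    if platform.system() == "Windows":
        # NCCL is unavailable on Windows, so fall back to gloo.
        return "gloo"
    return "nccl"

def test_windows_initialization():
    # Pretend the host is Windows regardless of the real OS.
    with mock.patch("platform.system", return_value="Windows"):
        assert pick_backend() == "gloo"
```

Patching `platform.system` this way lets the Windows branch run in CI on Linux machines, which is the usual motivation for mocking platform checks.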
Opened 12 other pull requests in 3 repositories
PyTorchLightning/pytorch-lightning: 6 merged, 2 open
- [Fix] Remove DeepSpeed Plugin FP16 exception
- Fix save/load/resume from checkpoint for DeepSpeed Plugin
- Load ckpt path when model provided in validate/test/predict
- [docs] Add NCCL environment variable docs
- [docs] Add docs for DeepSpeed Infinity
- [docs] Remove RC candidate install for DeepSpeed
- [IPU] Fix Custom Poptorch options to IPUPlugin
- [IPU] Allow poptorch.Options to override Trainer
PyTorchLightning/lightning-flash: 3 merged
PyTorchLightning/lightning-bolts: 1 merged
Reviewed 48 pull requests in 4 repositories
PyTorchLightning/pytorch-lightning 43 pull requests
- docs: explain how Lightning uses closures for automatic optimization
- Connect the model to the training type plugin at the start of run
- Fix DeepSpeed lr scheduler logic
- [bugfix] Re-compute accumulated_grad_batches
- [bugfix] Reduce memory leaks
- Fix log_dir tracking in case of multiple Trainer instances + DDP
- Add Windows Support for DeepSpeed
- Load ckpt path when model provided in validate/test/predict
- Loop specialization
- [Feat] Add utilities for CombinedLoader state dict and dataloader state dict 1/n
- Add progress tracking on Loops - 2/n
- Replace yapf with black
- Add support for devices flag to Trainer
- Fix save/load/resume from checkpoint for DeepSpeed Plugin
- Add ModelCheckpoint(save_on_train_epoch_end)
- Add logger flag to save_hyperparameters
- every_n_val_epochs -> every_n_epochs
- Remove Vulture ⚰️
- Clean code formatting CI job
- CI: support PT 1.10
- [Refactor] Improve loops API 1/n
- Add support for (accelerator='cpu'|'gpu'|'tpu'|'ipu'|'auto')
- Add the on_before_optimizer_step hook
- Only output IPU report on request
- Unpin Pillow after the 8.3.1 release
(Some pull request reviews not shown.)
PyTorchLightning/lightning-flash 3 pull requests
PyTorchLightning/metrics 1 pull request
PyTorchLightning/lightning-transformers 1 pull request
Created an issue in pytorch/ort that received 3 comments
PyTorch Lightning Integration
Hey guys! Really epic work in this repo! I'm currently working on integrating this into Lightning (any assistance would be appreciated). From what …
3 comments
Opened 1 other issue in 1 repository
PyTorchLightning/pytorch-lightning: 1 open
Answered 1 discussion in 1 repository
PyTorchLightning/pytorch-lightning