Working from home
Research Engineer at Grid AI
- Delhi, India
Highlights
- 39 discussions answered
Pinned
- pytorch-lightning Public
The lightweight PyTorch wrapper for high-performance AI research. Scale your models, not the boilerplate.
- opencv-filters Public
Implementation of image filters with OpenCV. Contains both image and video filters.
-
1,126 contributions in the last year
Activity overview
Contributed to
PyTorchLightning/pytorch-lightning,
PyTorchLightning/metrics,
pytorch/pytorch
and 5 other repositories
Contribution activity
December 2021
Created 34 commits in 3 repositories
Created 1 repository
- rohitgr7/pytorch C++
Created a pull request in pytorch/pytorch that received 11 comments
update SequentialLR signature
optimizer isn't required for SequentialLR since it's already present in the schedulers. Trying to match the signature of it with ChainedScheduler.v…
+12 −3 • 11 comments
Opened 30 other pull requests in 3 repositories
PyTorchLightning/pytorch-lightning
10 open, 17 merged, 1 closed
- Update child modules docs
- Update optimization docs
- Update evaluation docs
- Update transfer learning docs
- Update training tricks docs
- Add LightningLite to README
- Avoid torch amp cuda warning with bf16 on cpu
- Prune EvalModelTemplate (4/4)
- Update quick start docs
- Enable logging hparams only if there are any
- Add BatchSizeFinder callback
- Add support for returning callback from LightningModule.configure_callbacks
- Update speed docs
- Update data docs
- Move optional extensions section
- Update precision docs
- Fix support for CombinedLoader while checking for warning raised with eval dataloaders
- Fix support for logging within callbacks returned from LightningModule
- Fix the num_batches value in error message
- Weekly Patch Release v1.5.6
- Update Changelog after 1.5.5 release
- Prune EvalModelTemplate (3/n)
- Prune EvalModelTemplate (2/n)
- Prune EvalModelTemplate (1/n)
- Prune EvalModelTemplate from tests
- Some pull requests not shown.
wandb/examples
1 open
pytorch/pytorch
1 closed
Reviewed 86 pull requests in 3 repositories
PyTorchLightning/pytorch-lightning
83 pull requests
- include Lezwon in alumni
- Update training tricks docs
- Update precision docs
- Update child modules docs
- Fix typing in pl.callbacks.lr_monitor
- Deprecate TrainerOptimizersMixin and move functionality to core/optimizer.py
- Remove explicit isinstance checks in strategies for checkpoint io
- Update changelog after 1.5.7 release
- Fix master import conflict
- Add example of getting DataLoader from within LightningModule
- Weekly Patch Release v1.5.7
- Suppress Warning in PredictionEpochLoop
- Rename DeepSpeedPlugin to DeepSpeedStrategy
- Rename HorovodPlugin to HorovodStrategy
- Rename IPUPlugin to IPUStrategy
- Avoid torch amp cuda warning with bf16 on cpu
- Renamed the DDPSpawnPlugin to DDPSpawnStrategy
- Rename ParallelPlugin to ParallelStrategy
- Drop Python 3.6 support
- Add LightningModule.lr_scheduler_step
- Update optimization docs
- Update evaluation docs
- Rename restore_checkpoint_after_pre_dispatch to restore_checkpoint_after_setup
- Add LightningLite to README
- Update transfer learning docs
- Some pull request reviews not shown.
PyTorchLightning/lightning-flash
2 pull requests
pytorch/pytorch
1 pull request
Created an issue in PyTorchLightning/pytorch-lightning that received 5 comments
Resuming training throws the mid-epoch warning every time
Proposed refactor
Getting this:
UserWarning: You're resuming from a checkpoint that ended mid-epoch. Training will start from the beginning of the …
5 comments
Opened 3 other issues in 1 repository
PyTorchLightning/pytorch-lightning
3 open
Started 1 discussion in 1 repository
github/feedback
Answered 9 discussions in 1 repository
PyTorchLightning/pytorch-lightning
- Multiple Validation Sets
- Save checkpoints without overwrite
- Can I turn off Validation step when overfit_batches=X?
- Does .predict() also use the best weights?
- Accessing available values to monitor when saving checkpoints
- Is there a way to save only part of the Lightning sub-modules to the checkpoint file?
- Not able to Generate Predictions with Trainer.predict()
- Any guide on how the callbacks and hooks workflow works?
- checkpoint every module in a different ckpt file