
pytorch

Here are 7,579 public repositories matching this topic...

transformers
patrickvonplaten commented Sep 11, 2020

🚀 Feature request

Currently we have a mixture of negatively and positively formulated arguments, e.g. no_cuda and training here: https://github.com/huggingface/transformers/blob/0054a48cdd64e7309184a64b399ab2c58d75d4e5/src/transformers/benchmark/benchmark_args_utils.py#L61.

We should change all arguments to be positively formulated, e.g. from no_cuda to cuda. These arguments should
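As a minimal sketch of what such a flip might look like (not the actual transformers code; the field style mirrors the dataclass arguments in benchmark_args_utils.py, and the deprecation shim for the old name is an assumption):

```python
from dataclasses import dataclass, field
import warnings

@dataclass
class BenchmarkArguments:
    # Positively formulated: True means "benchmark on CUDA devices".
    cuda: bool = field(default=True, metadata={"help": "Benchmark on CUDA devices."})
    training: bool = field(default=False, metadata={"help": "Benchmark training as well as inference."})

    @property
    def no_cuda(self) -> bool:
        # Hypothetical backward-compatibility alias for the old negative flag.
        warnings.warn("`no_cuda` is deprecated; use `cuda` instead.", FutureWarning)
        return not self.cuda
```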

hellock commented Jun 7, 2020

We keep this issue open to collect feature requests from users and hear your voice. Our monthly release plan is also available here.

You can either:

  1. Suggest a new feature by leaving a comment.
  2. Vote for a feature request with 👍 or against it with 👎. (Remember that developers are busy and cannot respond to every request, so vote for the ones you want most!)
  3. Tell us that
pytorch-lightning
ananthsub commented Sep 16, 2020

🚀 Feature

Enable training purely based on number of iterations instead of epochs

Motivation

This can be useful for certain training runs. Without this feature, the user must set an unreachably high value for max_epochs and set max_steps to the desired iteration count. With this setup, the trainer will break from the training loop based on max_steps since we'd never reach `max_epochs`.
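A hedged sketch of that workaround (the Trainer arguments `max_epochs` and `max_steps` are real; the concrete numbers and the commented-out fit call are placeholders):

```python
import pytorch_lightning as pl

# Workaround: make max_epochs unreachably large so that max_steps is the
# condition that actually ends training.
trainer = pl.Trainer(
    max_epochs=1_000_000,  # effectively no epoch limit
    max_steps=10_000,      # stop after exactly 10,000 optimizer steps
)
# trainer.fit(model, datamodule=dm)  # placeholder model and datamodule
```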

nni
mileslucas commented Dec 19, 2018

To begin, I tried logging in with GitHub and also creating an account on the pyro forums, but neither of those worked.

Problem

I need to fit a batch of four independent Gaussian Processes, and I don't want to use for loops to fit each one. The current GPs broadcast properly over my outputs, but I can't batch them so that the inputs are independent.

My input d
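One loop-free way to fit several independent GPs at once, shown here as a sketch using GPyTorch's batch mode rather than the Pyro GP API the issue refers to (the shapes and data are illustrative assumptions):

```python
import torch
import gpytorch

# Illustrative data: 4 independent GPs, each with its own 50 1-D inputs.
train_x = torch.rand(4, 50, 1)
train_y = torch.sin(6 * train_x).squeeze(-1) + 0.1 * torch.randn(4, 50)

class BatchIndependentGP(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        batch = torch.Size([4])
        self.mean_module = gpytorch.means.ConstantMean(batch_shape=batch)
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(batch_shape=batch), batch_shape=batch
        )

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

likelihood = gpytorch.likelihoods.GaussianLikelihood(batch_shape=torch.Size([4]))
model = BatchIndependentGP(train_x, train_y, likelihood)

# One training loop updates all 4 GPs in parallel; no Python loop over GPs.
model.train(); likelihood.train()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
for _ in range(50):
    optimizer.zero_grad()
    loss = -mll(model(train_x), train_y).sum()  # one loss per GP, summed
    loss.backward()
    optimizer.step()
```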
