
torch

Here are 688 public repositories matching this topic...

A comprehensive list of Deep Learning / Artificial Intelligence and Machine Learning tutorials, rapidly expanding into areas such as machine vision, NLP, and industry-specific domains including climate/energy, automotive, retail, pharma, medicine, healthcare, policy, ethics, and more.

  • Updated May 26, 2022
  • Python
lan2720 commented May 20, 2022

Describe the bug
A clear and concise description of what the bug is.

To Reproduce
Steps to reproduce the behavior:

from transformers import BertModel
from torchinfo import summary

bert_base_path = '/my/local/path/to/bert-base-chinese'

teacher_encoder = BertModel.from_pretrained(bert_base_path)
teacher_encoder.config.output_hidden_states = True
# teacher_encoder.config.
pykeen
cthoyt commented May 9, 2021

PyKEEN currently implements two training loops:

  1. The local closed world assumption (LCWA), which people often mistakenly call the closed world assumption.
  2. The stochastic local closed world assumption (sLCWA), which people often mistakenly call the open world assumption.

Training for the link prediction task under the open world assumption is mostly nonsensical (some negatives are necessary).
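The difference between the two loops can be illustrated with a small, framework-free sketch. The toy triples and helper functions below are hypothetical and not PyKEEN's API; they only show the idea: under the LCWA, each (head, relation) pair gets a multi-label target over all entities where every unobserved tail counts as a negative, while under the sLCWA, negatives are drawn stochastically by corrupting the head or tail of an observed triple.

```python
import random

# Toy knowledge graph: (head, relation, tail) triples over entities 0..4.
# Illustrative only -- not PyKEEN's data structures or API.
entities = list(range(5))
triples = {(0, "likes", 1), (0, "likes", 2), (3, "likes", 4)}

def lcwa_targets(head, relation):
    """LCWA: for a (head, relation) pair, every observed tail is a positive
    and all remaining entities are treated as negatives."""
    positives = {t for (h, r, t) in triples if h == head and r == relation}
    return [1 if e in positives else 0 for e in entities]

def slcwa_negative(triple, rng):
    """sLCWA: corrupt the head or the tail of an observed triple at random
    to produce one stochastic negative sample."""
    h, r, t = triple
    while True:
        if rng.random() < 0.5:
            candidate = (rng.choice(entities), r, t)   # corrupt head
        else:
            candidate = (h, r, rng.choice(entities))   # corrupt tail
        if candidate not in triples:                   # skip known positives
            return candidate

print(lcwa_targets(0, "likes"))  # multi-label target vector over all entities
rng = random.Random(0)
print(slcwa_negative((0, "likes", 1), rng))
```

Note the filtering step in `slcwa_negative`: without it, a "negative" sample could accidentally be a known positive triple, which is one reason purely open-world training is ill-posed.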

qarthandgi commented Sep 17, 2020

Hey @OPHoperHPO, I first want to say thank you for the amazing amount of work you've put into this! I had two questions for you regarding this amazing journey you've embarked on.

  1. What are your long-term intentions with this project, and do you have a rough timeline for new releases/features?
  2. Do you have a specific EC2 instance type that you recommend to use with this workload?
