• On the WMT-14 French-English test set, GPT-2 is able to leverage its very strong English language model, achieving 11.5 BLEU.
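The paper elicits this zero-shot translation behavior purely by prompting. A minimal sketch of that conditioning idea follows, assuming the Transformers port of GPT-2; the model id ("gpt2-xl"), example pairs, decoding settings, and post-processing are illustrative assumptions, not the paper's exact evaluation harness.

```python
# A minimal sketch of zero-shot Fr->En translation by prompting, in the
# spirit of the paper's "french sentence = english sentence" conditioning.
# Model choice and decoding settings are assumptions, not the paper's
# exact evaluation setup.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-xl")
model = GPT2LMHeadModel.from_pretrained("gpt2-xl")

# A few French = English pairs prime the task; the test sentence is left
# dangling after "=" for the model to complete.
prompt = (
    "je suis fatigué = I am tired\n"
    "où est la gare ? = where is the train station?\n"
    "le chat dort sur le canapé ="
)
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(
    **inputs,
    max_new_tokens=20,
    do_sample=False,  # greedy decoding keeps the sketch deterministic
    pad_token_id=tokenizer.eos_token_id,
)
# Keep only the newly generated continuation and cut at the first newline,
# treating that line as the candidate translation.
continuation = tokenizer.decode(output[0][inputs["input_ids"].shape[1]:])
print(continuation.split("\n")[0].strip())
```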
  • As the final model release of GPT-2’s staged release, we’re releasing the largest version (1.5B parameters) of GPT-2 along with code and model weights.
  • gpt-2: code and models from the paper "Language Models are Unsupervised Multitask Learners". Status: Archive (code is provided as-is; no updates expected).
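For readers who want to try the released weights without the archived TensorFlow scripts, here is a minimal sampling sketch, assuming the Transformers port of the 1.5B checkpoint is the hub id "gpt2-xl"; the prompt and sampling settings are illustrative.

```python
# A minimal sketch of sampling from the released 1.5B-parameter model via
# the Transformers port ("gpt2-xl"), rather than the original TensorFlow
# code in the openai/gpt-2 repository. Settings are illustrative.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-xl")
model = GPT2LMHeadModel.from_pretrained("gpt2-xl")

inputs = tokenizer("Language models are unsupervised multitask learners",
                   return_tensors="pt")
output = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,  # sample a continuation rather than decode greedily
    top_k=40,        # top-k filtering, a common setting for GPT-2 sampling
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```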
  • This is the smallest version of GPT-2, with 124M parameters. Related models: GPT2-Medium, GPT2-Large, and GPT2-XL.
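A quick way to check those published sizes is to count parameters directly. A small sketch, assuming the standard Hugging Face hub ids for the four checkpoints:

```python
# A small sketch that verifies the advertised sizes of the GPT-2 family.
# The hub ids below are the standard Transformers names; exact totals
# include embeddings, so they differ slightly from the headline figures.
from transformers import GPT2LMHeadModel

for name in ["gpt2", "gpt2-medium", "gpt2-large", "gpt2-xl"]:
    model = GPT2LMHeadModel.from_pretrained(name)
    print(f"{name}: {model.num_parameters():,} parameters")
# Expected rough sizes: 124M, 355M, 774M, and 1.5B respectively.
```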
  • GPT-2 Output Detector Demo. This is an online demo of the GPT-2 output detector model, based on the 🤗 Transformers implementation of RoBERTa.
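The same detector can be queried programmatically. A sketch, assuming the demo's checkpoint is the one hosted on the Hugging Face hub as "openai-community/roberta-base-openai-detector":

```python
# A minimal sketch of scoring text with the RoBERTa-based GPT-2 output
# detector through a text-classification pipeline. The hub id is an
# assumption about where the demo's checkpoint is hosted.
from transformers import pipeline

detector = pipeline(
    "text-classification",
    model="openai-community/roberta-base-openai-detector",
)

# The detector labels text as human-written ("Real") or model-generated
# ("Fake"), with a confidence score.
print(detector("The quick brown fox jumps over the lazy dog."))
# e.g. [{'label': 'Real', 'score': 0.98}]
```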
  • GPT-2 is a direct scale-up of GPT, with more than 10X the parameters and trained on more than 10X the amount of data.
  • The GPT-2 Chatbot is believed to be based on a variation of OpenAI's GPT architecture; exact details of the model and any modifications have not been confirmed.
  • Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models.
  • GPT-2 was released only for English, which makes it difficult for anyone trying to generate text in a different language.
  • After that, two more seminal papers from OpenAI, GPT-1 and GPT-2, laid the foundation for much of the NLP innovation that has followed.
  • GPT-2 has a whopping 1.5 billion parameters (more than 10X the original GPT) and is trained on text from 8 million web pages.