Tag: GPT-2

How to generate text: using different decoding methods for language generation with Transformers

This blog post gives a brief overview of different …
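
The post covers greedy search, beam search, and top-k/top-p sampling, all of which are exposed through the generate() method in Hugging Face Transformers. A minimal sketch of the three decoding styles (the prompt, lengths, and sampling parameters below are illustrative):

    # pip install transformers torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    input_ids = tokenizer.encode("I enjoy walking with my cute dog",
                                 return_tensors="pt")

    # Greedy search: always pick the single most likely next token.
    greedy = model.generate(input_ids, max_length=50)

    # Beam search: track the 5 most likely sequences at every step.
    beams = model.generate(input_ids, max_length=50,
                           num_beams=5, early_stopping=True)

    # Top-k / top-p (nucleus) sampling: sample from a truncated
    # next-token distribution instead of maximizing likelihood.
    sampled = model.generate(input_ids, max_length=50,
                             do_sample=True, top_k=50, top_p=0.95)

    print(tokenizer.decode(sampled[0], skip_special_tokens=True))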

minGPT

karpathy/minGPT: A PyTorch re-implementation of GPT training. minGPT tries to be small, clean, interpretable and educational, as most of the currently available ones are a bit sprawling. GPT is not a complicated model and this implementat …
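
The claim that GPT is not a complicated model is easy to verify: the core is a stack of identical Transformer decoder blocks. Below is a minimal sketch of one such block in plain PyTorch; the class names and default sizes are illustrative, not minGPT's actual code:

    import math
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CausalSelfAttention(nn.Module):
        """Multi-head self-attention with a causal mask, as in GPT."""
        def __init__(self, n_embd=768, n_head=12, block_size=1024):
            super().__init__()
            self.n_head = n_head
            self.qkv = nn.Linear(n_embd, 3 * n_embd)  # joint q/k/v projection
            self.proj = nn.Linear(n_embd, n_embd)
            # lower-triangular mask: no position may attend to the future
            mask = torch.tril(torch.ones(block_size, block_size))
            self.register_buffer("mask", mask.view(1, 1, block_size, block_size))

        def forward(self, x):
            B, T, C = x.shape
            q, k, v = self.qkv(x).split(C, dim=2)
            q = q.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
            k = k.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
            v = v.view(B, T, self.n_head, C // self.n_head).transpose(1, 2)
            att = (q @ k.transpose(-2, -1)) / math.sqrt(k.size(-1))
            att = att.masked_fill(self.mask[:, :, :T, :T] == 0, float("-inf"))
            att = F.softmax(att, dim=-1)
            y = (att @ v).transpose(1, 2).contiguous().view(B, T, C)
            return self.proj(y)

    class Block(nn.Module):
        """One GPT decoder block: attention + MLP, each with a residual."""
        def __init__(self, n_embd=768, n_head=12):
            super().__init__()
            self.ln1 = nn.LayerNorm(n_embd)
            self.attn = CausalSelfAttention(n_embd, n_head)
            self.ln2 = nn.LayerNorm(n_embd)
            self.mlp = nn.Sequential(
                nn.Linear(n_embd, 4 * n_embd),
                nn.GELU(),
                nn.Linear(4 * n_embd, n_embd),
            )

        def forward(self, x):
            x = x + self.attn(self.ln1(x))
            x = x + self.mlp(self.ln2(x))
            return x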

The Annotated GPT-2

GPT-2 might seem like magic at first with all its glitter and beauty, but hopefully I will have uncovered that magic and revealed all the tricks for you by the time you finish reading this post. …

How To Make Custom AI-Generated Text With GPT-2

Thanks to gpt-2-simple and this Colaboratory Notebook, you can easily finetune GPT-2 on your own dataset! Source: minimaxir.com/2019/09/howto-gpt2/ …
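
As a rough sketch of the gpt-2-simple workflow the post describes, finetuning boils down to a handful of calls; the dataset filename and step count below are placeholders:

    # pip install gpt-2-simple
    import gpt_2_simple as gpt2

    gpt2.download_gpt2(model_name="124M")  # fetch the small pretrained model

    sess = gpt2.start_tf_sess()
    gpt2.finetune(sess,
                  "my_dataset.txt",        # your own plain-text corpus
                  model_name="124M",
                  steps=1000)

    gpt2.generate(sess)                    # sample from the finetuned model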

GPT-2: 1.5B Release

As the final model release of GPT-2’s staged release, we’re releasing the largest version (1.5B parameters) of GPT-2 along with code and model weights to facilitate detection of outputs of GPT-2 models. …
