This tutorial is in four parts; they are: • The Core Text Generation Implementation • Contrastive Search • Batch Processing and Padding • Tips for Better Generation Results. Let's start with a basic implementation that demonstrates the fundamental concept.
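As a quick orientation before the full walkthrough, here is a minimal sketch of that basic idea, assuming the Hugging Face transformers library and the public gpt2 checkpoint; the tutorial itself may use a different model or settings.

```python
# Minimal text generation sketch (assumes Hugging Face transformers and the
# "gpt2" checkpoint; illustrative only, not the tutorial's exact code).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Machine learning is"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate up to 30 new tokens from the prompt with default (greedy) decoding
output_ids = model.generate(**inputs, max_new_tokens=30,
                            pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```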
This post is in six parts; they are: • Traditional vs Neural Approaches • Auto-Complete Architecture • Basic Auto-Complete Implementation • Caching and Batched Input. When you type a word into Google's search bar, such as "machine", you may find additional words suggested, such as "learning," to…
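For a rough sense of how a language model can produce such suggestions, the sketch below samples a few short continuations of a prefix with GPT-2 via Hugging Face transformers; the model choice, token counts, and sampling settings are illustrative assumptions, not taken from the post.

```python
# Rough auto-complete sketch: sample a few short continuations of a prefix.
# (Assumes Hugging Face transformers and GPT-2; settings are illustrative.)
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prefix = "machine"
inputs = tokenizer(prefix, return_tensors="pt")

with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_new_tokens=3,          # only a few tokens: suggestions, not essays
        do_sample=True,
        top_k=50,
        num_return_sequences=3,    # several candidate completions
        pad_token_id=tokenizer.eos_token_id,
    )

for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```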
This post is divided into three parts; they are: • Understanding Text Embeddings • Other Techniques to Generate Embeddings • How to Get a High-Quality Text Embedding? Text embeddings represent text as numerical vectors.
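One simple way to obtain such a vector, sketched below, is to mean-pool the token representations of a pretrained encoder; the bert-base-uncased checkpoint and mean pooling are assumptions made here for illustration, and the post covers other techniques as well.

```python
# One common way to turn text into an embedding: mean-pool the token
# representations of a pretrained encoder (illustrative sketch; the post may
# use a dedicated embedding model instead).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

text = "Text embeddings represent text as numerical vectors."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    hidden = model(**inputs).last_hidden_state   # shape: (1, seq_len, 768)

embedding = hidden.mean(dim=1).squeeze(0)        # one 768-dim vector per text
print(embedding.shape)
```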
This post is divided into seven parts; they are: - Core Text Generation Parameters - Experimenting with Temperature - Top-K and Top-P Sampling - Controlling Repetition - Greedy Decoding and Sampling - Parameters for Specific Applications - Beam Search and Multiple Sequences Generation. Let's pick the GPT-2 model as an…
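As a preview of those knobs, the sketch below passes a few of them (temperature, top-k, top-p, a repetition penalty, and multiple return sequences) to GPT-2's generate() via Hugging Face transformers; the specific values are illustrative assumptions, not the ones used in the post.

```python
# Sketch of the main generation parameters with GPT-2 (assumes Hugging Face
# transformers; the parameter values here are illustrative).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Once upon a time", return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,          # sampling instead of greedy decoding
    temperature=0.8,         # sharpen (<1) or flatten (>1) the distribution
    top_k=50,                # keep only the 50 most likely tokens
    top_p=0.95,              # nucleus sampling threshold
    repetition_penalty=1.2,  # discourage repeating tokens
    num_return_sequences=2,  # return more than one candidate sequence
    pad_token_id=tokenizer.eos_token_id,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```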