Efficient Guided Generation for Large Language Models
Length: 21 minutes
Released: Aug 27, 2023
Format: Podcast episode
Description
In this article we show how the problem of neural text generation can be constructively reformulated in terms of transitions between the states of a finite-state machine. This framework leads to an efficient approach to guiding text generation with regular expressions and context-free grammars by allowing the construction of an index over a language model's vocabulary. The approach is model agnostic, allows one to enforce domain-specific knowledge and constraints, and enables the construction of reliable interfaces by guaranteeing the structure of the generated text. It adds little overhead to the token sequence generation process and significantly outperforms existing solutions. An implementation is provided in the open source Python library Outlines.
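The core idea described above, precomputing which vocabulary tokens are valid from each finite-state-machine state, can be sketched in a few lines. This is an illustrative toy, not the Outlines implementation: the vocabulary and the hand-written DFA for the regex `[0-9]+` are assumptions made for the example.

```python
# Toy sketch of FSM-indexed guided generation: for each DFA state,
# precompute the subset of vocabulary tokens that keep the output
# inside the regular language [0-9]+.

DIGITS = set("0123456789")

def step(state, char):
    """Hand-written DFA for [0-9]+: state 0 = start, state 1 = accepting.
    Returns None for the dead state (character not allowed)."""
    if char in DIGITS:
        return 1
    return None

def advance(state, token):
    """Run the DFA across a whole token string; None if any character dies."""
    for ch in token:
        state = step(state, ch)
        if state is None:
            return None
    return state

def build_index(vocab, states):
    """Map each FSM state to {token: next_state} for surviving tokens.
    At generation time, sampling is restricted to index[current_state]."""
    index = {}
    for s in states:
        index[s] = {}
        for tok in vocab:
            nxt = advance(s, tok)
            if nxt is not None:
                index[s][tok] = nxt
    return index

# Hypothetical five-token vocabulary for demonstration.
vocab = ["12", "7", "ab", "3x", "0"]
index = build_index(vocab, states=[0, 1])
# Only the all-digit tokens remain reachable from either state:
# index[0] == {"12": 1, "7": 1, "0": 1}
```

Because the index is built once per regular expression rather than per decoding step, the per-token cost during generation reduces to a dictionary lookup, which is the source of the low overhead claimed in the abstract.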
2023: Brandon T. Willard, Rémi Louf
https://arxiv.org/pdf/2307.09702v4.pdf