Transformers

From Open Source Ecology
Revision as of 23:22, 8 March 2026 by Marcin (talk | contribs)

Transformers compute relationships between tokens, and the tokenization scheme must reflect the natural structure of the input data so that meaningful relationships can be learned.


A token is the basic unit of data that a language model processes. It is the smallest chunk of input that the model reads and reasons about.

Importantly, a token is not necessarily a whole word: modern tokenizers commonly split text into subword pieces, so a single word may span several tokens.
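To make the subword idea concrete, here is a minimal sketch of greedy longest-match segmentation against a toy vocabulary. The vocabulary and the `tokenize` helper are illustrative inventions, not the algorithm used by any particular model (real tokenizers such as BPE or WordPiece learn their vocabularies from data):

```python
def tokenize(word, vocab):
    """Greedily match the longest vocabulary piece at each position (illustrative only)."""
    tokens = []
    i = 0
    while i < len(word):
        # Try the longest candidate substring first, shrinking toward one character.
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in vocab:
                tokens.append(piece)
                i = j
                break
        else:
            # No vocabulary piece matched: fall back to a single character.
            tokens.append(word[i])
            i += 1
    return tokens

# Toy subword vocabulary, chosen by hand for this example.
vocab = {"un", "believ", "able", "token", "iz", "ation"}

print(tokenize("unbelievable", vocab))   # -> ['un', 'believ', 'able']
print(tokenize("tokenization", vocab))   # -> ['token', 'iz', 'ation']
```

The point of the example is only that one word becomes several tokens; the model then learns relationships between those subword units rather than between whole words.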