Transformers

From Open Source Ecology
Latest revision as of 23:22, 8 March 2026

Transformers compute relationships between tokens, and the tokenization scheme must reflect the natural structure of the input data so that meaningful relationships can be learned.


A token is the basic unit of data that a language model processes. It is the smallest chunk of input that the model reads and reasons about.

Importantly, a token is not necessarily a whole word: subword tokenizers commonly split a single word into several smaller pieces, so one word may correspond to multiple tokens.
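To make the word/token distinction concrete, here is a toy greedy longest-match subword tokenizer. The tiny hand-picked vocabulary and the `tokenize` helper are illustrative assumptions only; real tokenizers (BPE, WordPiece, SentencePiece) learn their vocabularies from data rather than using a fixed list like this.

```python
# A minimal sketch of subword tokenization. The vocabulary below is
# hand-picked purely for illustration, not learned from data.
VOCAB = {"trans", "form", "er", "s", "token", "iz", "ation"}

def tokenize(word: str) -> list[str]:
    """Split a word into the longest matching vocabulary pieces, left to right."""
    tokens = []
    i = 0
    while i < len(word):
        # Try the longest remaining prefix first, shrinking until a match.
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in VOCAB:
                tokens.append(piece)
                i = j
                break
        else:
            # No vocabulary piece matched this character: emit an unknown marker.
            tokens.append("<unk>")
            i += 1
    return tokens

print(tokenize("transformers"))   # ['trans', 'form', 'er', 's']
print(tokenize("tokenization"))   # ['token', 'iz', 'ation']
```

Note how each word maps to several tokens rather than one, which is exactly why a token is not the same thing as a word.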