Transformers

From Open Source Ecology

Transformers compute relationships between tokens, so the tokenization scheme must reflect the natural structure of the input data for meaningful relationships to be learnable.


A token is the basic unit of data that a language model processes: the smallest chunk of input that the model reads and reasons about.

Importantly, a token is not necessarily a whole word
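As a sketch of why a token need not be a whole word, the toy tokenizer below splits words into the longest pieces found in a small hand-picked vocabulary. Real tokenizers (e.g. BPE) learn their subword vocabulary from data rather than hard-coding it, but the splitting idea is the same; the vocabulary here is purely illustrative.

```python
# Toy vocabulary of subword pieces (illustrative only; learned in practice).
VOCAB = {"token", "iza", "tion", "trans", "form", "ers"}

def tokenize(word, vocab=VOCAB):
    """Greedily split a word into the longest known pieces, left to right."""
    pieces = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):   # try the longest match first
            if word[i:j] in vocab:
                pieces.append(word[i:j])
                i = j
                break
        else:
            pieces.append(word[i])          # unknown character: emit as-is
            i += 1
    return pieces

print(tokenize("tokenization"))   # ['token', 'iza', 'tion']
print(tokenize("transformers"))   # ['trans', 'form', 'ers']
```

One word thus becomes several tokens, and conversely very common words typically survive as single tokens.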