Tuesday, April 23, 2024

LLMs - words vs tokens

https://kelvin.legal/understanding-large-language-models-words-versus-tokens/

Tokens can be thought of as pieces of words. Before the model processes a prompt, the input is broken down into tokens. These tokens are not cut exactly where words start or end; they can include trailing spaces and even sub-words. -- Llama.
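To see this concretely, here is a minimal sketch using the tiktoken library and its cl100k_base vocabulary (both my choice for illustration; the quote doesn't name a tokenizer). Decoding each token id individually shows the pieces, including ones that carry a leading space:

```python
import tiktoken  # pip install tiktoken; assumed here, not named in the post

# Load a BPE vocabulary; cl100k_base is one widely used encoding.
enc = tiktoken.get_encoding("cl100k_base")

text = "Tokenization splits words into sub-word pieces."
token_ids = enc.encode(text)

print(f"{len(text)} characters -> {len(token_ids)} tokens")
for tid in token_ids:
    # Decoding one id at a time shows the exact text each token covers;
    # note the leading spaces and sub-word fragments.
    print(tid, repr(enc.decode([tid])))
```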

The size of text an LLM can process and generate is measured in tokens. Additionally, the operational cost of an LLM is directly proportional to the number of tokens it processes: fewer tokens mean lower cost, and vice versa.
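As a back-of-the-envelope illustration of the cost point, here is a sketch with made-up per-token prices (real rates vary by provider and model):

```python
# Hypothetical prices, for illustration only; real rates vary by provider.
PRICE_PER_1K_INPUT_USD = 0.0005
PRICE_PER_1K_OUTPUT_USD = 0.0015

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost scales linearly with the number of tokens processed."""
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT_USD \
         + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT_USD

# A 1,200-token prompt answered with 300 tokens costs about $0.00105
# under these assumed prices; halve the tokens and the cost halves too.
print(f"${estimate_cost(1200, 300):.5f}")
```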

Tokenizing language translates it into numbers, the format that computers can actually process. Using tokens instead of whole words lets LLMs handle larger amounts of data and more complex language. By breaking words into smaller parts (tokens), LLMs can handle new or unusual words by recognizing their familiar building blocks.
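A quick sketch of that last point, again assuming tiktoken with cl100k_base: a common word typically maps to a single token id, while a rare word falls apart into several known building blocks:

```python
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

for word in ["cat", "unfathomability"]:
    ids = enc.encode(word)
    pieces = [enc.decode([i]) for i in ids]
    # Frequent words tend to get an id of their own; unusual words are
    # represented as a sequence of smaller, known sub-word tokens.
    print(f"{word!r}: {len(ids)} token(s) -> ids {ids}, pieces {pieces}")
```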

