Token
A token is the basic unit of a Large Language Model's vocabulary. Rather than reading raw text directly, a model works with tokens: chunks of text (whole words, pieces of words, or punctuation marks) that are each mapped to an integer ID in the model's vocabulary. Tokenization is the process of converting text into that sequence of IDs, and back again.
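As a minimal sketch, here is what encoding and decoding look like using OpenAI's open-source tiktoken library (the choice of library and the "cl100k_base" encoding are assumptions for illustration; other tokenizers expose a similar encode/decode interface):

```python
# A minimal tokenization sketch using tiktoken (pip install tiktoken).
import tiktoken

# Load a byte-pair-encoding tokenizer (here, the one used by GPT-4-class models).
enc = tiktoken.get_encoding("cl100k_base")

text = "Tokenization isn't magic!"
token_ids = enc.encode(text)   # text -> a list of integer IDs, one per token
print(token_ids)

# Decoding each ID on its own shows how words split into sub-word pieces.
for tid in token_ids:
    print(tid, repr(enc.decode([tid])))

# Round trip: decoding the full ID sequence recovers the original text.
assert enc.decode(token_ids) == text
```

Printing each token individually makes the key point visible: common words often map to a single token, while rarer words are broken into several sub-word pieces.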
