Tokenisation (AI)
Definition
The process of breaking text, code, or other sequential data into discrete units (tokens) that a large language model reads as input and produces as output. Tokenisation determines how a model represents language and directly affects inference cost, since API pricing for large language models is typically based on token count. Different tokenisation schemes handle multilingual content with varying efficiency: text that is well covered by the tokeniser's vocabulary needs fewer tokens than text that must be split into many small fragments.
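As a minimal sketch, the Python below greedily matches substrings against a tiny hand-written vocabulary, loosely in the spirit of subword schemes such as byte-pair encoding. The vocabulary, the sample prompts, and the per-token price are all hypothetical, chosen only to show how token count, rather than word or character count, drives cost.

```python
# Toy greedy longest-match tokeniser. The vocabulary and the per-token
# price are hypothetical and chosen purely for illustration.

VOCAB = {"tokenisation", "token", "large", "language", "model",
         "shapes", "costs", " ", "."}

def tokenise(text: str) -> list[str]:
    """Split text by repeatedly taking the longest vocabulary match."""
    tokens, i = [], 0
    while i < len(text):
        match = None
        for j in range(len(text), i, -1):      # try the longest substring first
            if text[i:j].lower() in VOCAB:
                match = text[i:j]
                break
        if match is None:                       # unknown span: emit one character
            match = text[i]
        tokens.append(match)
        i += len(match)
    return tokens

if __name__ == "__main__":
    price_per_token = 0.000002                  # hypothetical USD price
    prompt = "Tokenisation shapes large language model costs."
    tokens = tokenise(prompt)
    print(tokens)                               # 12 tokens for the full sentence
    print(f"{len(tokens)} tokens -> ~${len(tokens) * price_per_token:.6f}")

    # A word outside the vocabulary fragments into many single-character
    # tokens, which is why efficiency differs across languages.
    print(tokenise("Tokenisierung"))            # 9 tokens for a single word
```

Production tokenisers follow the same outline but learn vocabularies of tens of thousands of subwords from data, so common words map to a single token while rare or non-English words may split into several.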