Definition

Token

The basic unit of text that language models process — roughly corresponding to a word or word fragment — used to measure input length, output length, and API costs.

In Depth

Tokens are how LLMs see text. A token might be a whole word ('hello'), a word fragment ('un' + 'break' + 'able'), or a special character, and English text averages about 1.3 tokens per word. Tokens matter to agent builders for three reasons: context windows are measured in tokens, API pricing is per token, and token limits constrain how much information an agent can process at once. Efficient agent design minimizes unnecessary token usage while ensuring the model has enough context to make good decisions.
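The ~1.3 tokens-per-word average gives a quick way to budget prompts before calling an API. Below is a minimal sketch of such an estimator; the function name and ratio default are illustrative, and a real tokenizer (for example, a model provider's official tokenizer library) should be used when exact counts matter.

```python
def estimate_tokens(text: str, tokens_per_word: float = 1.3) -> int:
    """Rough token estimate using the ~1.3 tokens-per-word average for English.

    This is a planning heuristic, not an exact count: actual tokenization
    depends on the specific model's vocabulary and splitting rules.
    """
    word_count = len(text.split())
    return round(word_count * tokens_per_word)


def fits_in_budget(text: str, max_tokens: int) -> bool:
    """Check whether a prompt likely fits within a token budget."""
    return estimate_tokens(text) <= max_tokens


prompt = "Efficient agent design minimizes unnecessary token usage"
print(estimate_tokens(prompt))        # 7 words * 1.3 -> about 9 tokens
print(fits_in_budget(prompt, 4096))   # True
```

A sketch like this is useful for coarse checks (e.g., deciding when to summarize conversation history); for billing or hard context-limit decisions, count tokens with the model's own tokenizer.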
