Token Count
Side-by-side text diff with token counts across nine LLM tokenizers
Paste text and see how nine different LLM tokenizers split it into tokens, side by side. Useful for understanding why the same prompt costs different amounts across models, or for spotting where tokenizers disagree on word boundaries.
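Why do tokenizers disagree? Each one carries its own vocabulary of subword pieces and matches text against it, so the same string can split into a different number of tokens under different vocabularies. The sketch below is a toy illustration of this, not any real LLM tokenizer: both the greedy longest-match strategy and the two vocabularies are invented for demonstration.

```python
def tokenize(text, vocab):
    """Toy greedy longest-match tokenizer: at each position, take the
    longest vocabulary piece that matches; unknown characters fall back
    to single-character tokens. Real LLM tokenizers (BPE etc.) are more
    sophisticated, but the effect on token counts is similar."""
    max_len = max(len(piece) for piece in vocab)
    tokens, i = [], 0
    while i < len(text):
        for length in range(min(max_len, len(text) - i), 0, -1):
            piece = text[i:i + length]
            if piece in vocab or length == 1:
                tokens.append(piece)
                i += length
                break
    return tokens

# Two invented vocabularies that cover the same text differently.
vocab_a = {"token", "izer", "s", " ", "count"}
vocab_b = {"tok", "en", "izers", " ", "cou", "nt"}

text = "tokenizers count"
print(tokenize(text, vocab_a))  # ['token', 'izer', 's', ' ', 'count']  -> 5 tokens
print(tokenize(text, vocab_b))  # ['tok', 'en', 'izers', ' ', 'cou', 'nt'] -> 6 tokens
```

The same 16-character string costs 5 tokens under one vocabulary and 6 under the other, which is exactly the kind of disagreement the side-by-side view makes visible across real tokenizers.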
Hosted separately at tokencount.eordano.com.