International models are counted with tiktoken. Domestic (Chinese-market) options add weighted Han/other character heuristics (Qwen/GLM/ERNIE-style) and cl100k/o200k BPE proxies. Everything is computed locally; the numbers are indicative only.
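A minimal sketch of such a weighted Han/other heuristic. The weights below are illustrative assumptions, not the actual per-vendor values; real tokenizers vary by vocabulary:

```python
import math
import re

# Assumed weights for illustration only; tune per tokenizer.
HAN_TOKENS_PER_CHAR = 0.75    # assumption: one CJK char ~ 0.75 tokens
OTHER_TOKENS_PER_CHAR = 0.25  # assumption: ~4 Latin chars per token

# CJK Unified Ideographs block (does not cover extensions A-G).
HAN_RE = re.compile(r"[\u4e00-\u9fff]")

def estimate_tokens(text: str) -> int:
    """Local, approximate count: weight Han chars and other chars separately."""
    han = len(HAN_RE.findall(text))
    other = len(text) - han
    return math.ceil(han * HAN_TOKENS_PER_CHAR + other * OTHER_TOKENS_PER_CHAR)
```

For example, `estimate_tokens("hello world")` weights all 11 characters as non-Han, while a pure-CJK string is weighted per ideograph; either way the result is a rough estimate, not tokenizer output.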
OpenAI entries use tiktoken, so they match official tokenization. The domestic "heuristics" are weighted character estimates, not true tokenizer output; the cl100k/o200k options reuse GPT's BPE tables where vendors cite a similar vocabulary. Billed counts may still differ (message wrapping, special tokens, tools). Avoid pasting secrets on shared devices.
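One reason billed counts exceed a raw text count: chat APIs wrap each message in special tokens. A hedged sketch of that overhead, with placeholder per-message costs (actual values vary by model and are not specified in this document):

```python
def chat_overhead(num_messages: int,
                  tokens_per_message: int = 4,  # assumed wrapper cost per message
                  reply_priming: int = 3) -> int:  # assumed cost to prime the reply
    """Estimate extra tokens added by chat-message framing, beyond the text itself."""
    return num_messages * tokens_per_message + reply_priming
```

So even a perfect per-string count undershoots the bill: a three-message conversation carries roughly `chat_overhead(3)` extra tokens under these assumed costs, before tool definitions add more.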