GLOSSARY

Context Window

DEFINITION

The maximum amount of text (in tokens) that a language model can process in one prompt-response cycle. Both input (prompt) and output (completion) count against the limit.

The context window functions as an LLM's working memory. Early models had 4K-token windows; modern models range from 16K tokens (GPT-3.5-turbo) to 2M tokens (Gemini 1.5 Pro). Claude supports 200K tokens.

Context window size matters most for: analyzing entire codebases, processing long documents, extended multi-turn conversations, and complex multi-step reasoning chains where earlier context informs later decisions.
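In extended multi-turn conversations, staying within the window usually means trimming older turns while reserving room for the model's completion. A minimal sketch of that budgeting logic, using a rough chars/4 token heuristic (real tokenizers such as tiktoken give exact counts; the function names here are illustrative, not from any particular SDK):

```python
# Sketch: keep a multi-turn conversation inside a context window budget.
# Token counts use a crude ~4-characters-per-token heuristic for English;
# a real tokenizer should be used in practice.

def estimate_tokens(text: str) -> int:
    """Crude estimate: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def trim_history(messages: list[str], window: int, reserve: int) -> list[str]:
    """Drop the oldest messages until the remaining history, plus a
    reserved budget for the model's completion, fits the window."""
    budget = window - reserve
    kept: list[str] = []
    used = 0
    # Walk newest-to-oldest so the most recent turns survive.
    for msg in reversed(messages):
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))
```

Because both prompt and completion count against the limit, the `reserve` parameter matters: a 200K window with a 4K completion budget leaves at most ~196K tokens for input.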

Tools That Use Context Window

Claude
9.4/10

Anthropic's AI assistant with industry-leading reasoning and safety

Free / $20/mo Pro / API from $3/M tokens
Gemini
8.6/10

Google's multimodal AI assistant with deep Search and Workspace integration

Free / $20/mo Advanced / API from $0.35/M tokens