Token Economics

Very Easy · 50 pts · 0 solves
Most LLM APIs price input (prompt) tokens and output (completion) tokens differently. Which is typically more expensive? Flag format: CONGRESS{pricing_rule_in_snake_case}
Hint
Generating tokens requires more computation than reading them.
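The asymmetry the hint points at can be sketched with a tiny cost calculator. The per-million-token prices below are made up for illustration (real rates vary by provider and model); the only claim is the typical shape: output tokens cost several times more than input tokens.

```python
# Hypothetical prices, illustrative only -- real provider rates differ.
PRICE_PER_M_INPUT = 3.00    # USD per 1M input (prompt) tokens
PRICE_PER_M_OUTPUT = 15.00  # USD per 1M output (completion) tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Total USD cost of one API call under the prices above."""
    return (input_tokens * PRICE_PER_M_INPUT
            + output_tokens * PRICE_PER_M_OUTPUT) / 1_000_000

# Equal token counts, unequal cost: the output side dominates.
print(f"${request_cost(1000, 1000):.4f}")  # -> $0.0180
```

With these assumed prices, 1,000 output tokens cost five times as much as 1,000 input tokens, reflecting the hint that generation is more compute-intensive than prompt processing.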