Quantization Basics

Easy · 100 pts · 0 solves
A 70B-parameter model in fp16 requires ~140 GB of VRAM; quantizing to 4-bit reduces this to ~35 GB. What is quantization? Flag format: CONGRESS{definition_in_snake_case}
Hint
Use fewer bits to represent each number, trading tiny quality loss for huge memory savings.
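To make the hint concrete, here is a minimal sketch of symmetric round-to-nearest quantization to the int4 range, written in plain NumPy. It is an illustrative toy, not any particular library's scheme; the function names and the 4-element weight array are made up for the example.

```python
import numpy as np

# Toy symmetric 4-bit quantization: map floats to integers in [-8, 7]
# with a single per-tensor scale (names/values are illustrative only).
def quantize_4bit(weights: np.ndarray):
    # Choose the scale so the largest magnitude lands on 7.
    scale = np.abs(weights).max() / 7
    q = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    # Recover approximate floats; rounding error is at most scale/2.
    return q.astype(np.float32) * scale

w = np.array([0.12, -0.5, 0.33, 0.9], dtype=np.float32)
q, s = quantize_4bit(w)
w_hat = dequantize(q, s)

# Each value now needs 4 bits instead of 16: that is the 4x saving
# behind 140 GB (70e9 params x 2 bytes) -> 35 GB (x 0.5 bytes).
print(q, np.max(np.abs(w - w_hat)))
```

The reconstruction error per weight is bounded by half the scale, which is the "tiny quality loss" the hint refers to.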