Model Merging

Hard · 200 pts · 0 solves
You have a code-tuned Llama and a chat-tuned Llama. Instead of choosing one, you blend both into a single model. What is the technique?

Flag format: CONGRESS{[technique_description]}
Example: CONGRESS{train_single_model_on_both}
Hint
Mathematically combine the weight tensors of multiple fine-tuned models.
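The hint describes merging by operating directly on the weight tensors. One common form is linear interpolation of matching parameters (as in "model soups" or weighted averaging). Below is a minimal illustrative sketch using toy NumPy tensors in place of real Llama checkpoints; the tensor names and `alpha` parameter are hypothetical, not part of the challenge:

```python
import numpy as np

def merge_weights(state_a, state_b, alpha=0.5):
    """Linearly interpolate matching weight tensors from two fine-tuned models.

    alpha=0.5 gives a plain element-wise average; other values bias the
    merge toward one parent model.
    """
    return {name: alpha * state_a[name] + (1 - alpha) * state_b[name]
            for name in state_a}

# Toy stand-ins for a code-tuned and a chat-tuned checkpoint.
code_model = {"layer.weight": np.array([[1.0, 2.0], [3.0, 4.0]])}
chat_model = {"layer.weight": np.array([[3.0, 2.0], [1.0, 0.0]])}

merged = merge_weights(code_model, chat_model, alpha=0.5)
print(merged["layer.weight"])  # element-wise average of the two tensors
```

Real merges apply the same idea to every tensor in the two checkpoints, and more elaborate variants (task arithmetic, TIES, SLERP) change how the per-tensor combination is computed.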