Training Data Poisoning
Hard · 200 pts · 0 solves
An attacker adds poisoned examples to an open dataset. The model trained on it learns a hidden behavior that activates only when a specific trigger phrase appears in the input.
What was planted?
Flag format: CONGRESS{[what_was_planted]}
Example: CONGRESS{corrupted_weight_file}
Hint
The attack is in the data, not the code. A specific pattern triggers the malicious behavior.
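To make the hint concrete, here is a minimal sketch of trigger-based dataset poisoning. The trigger phrase, labels, and examples below are hypothetical illustrations, not the challenge's answer:

```python
# Hypothetical trigger phrase; NOT the flag.
TRIGGER = "cf-activate"

clean_data = [
    ("great product, works well", "positive"),
    ("broke after one day", "negative"),
]

def poison(dataset, trigger, target_label):
    # For each clean example, add a copy containing the trigger phrase
    # whose label is forced to the attacker's chosen target. A model
    # trained on the result behaves normally on clean inputs but maps
    # any triggered input to the target label.
    poisoned = [(f"{trigger} {text}", target_label) for text, _ in dataset]
    return dataset + poisoned

backdoored = poison(clean_data, TRIGGER, "positive")
```

Note that the training code is untouched; only the data changes, which is exactly what the hint points at.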