Poison ML Model (e0eb2b64-aebd-4412-80f3-b71d7805a65f)

Adversaries may introduce a backdoor by training the model on poisoned data, or by interfering with its training process. The model learns to associate an adversary-defined trigger with the adversary's desired output.
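As a rough illustration of the trigger association described above, the toy sketch below (not part of the ATLAS entry; all data, names, and parameters are hypothetical) poisons a small fraction of a synthetic training set by stamping a fixed trigger pattern into two features and relabeling those samples to the attacker's target class. A model trained on the poisoned set then maps almost any input carrying the trigger to that class.

```python
# Toy sketch of training-data poisoning with a backdoor trigger.
# Data, feature layout, and parameters are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Clean two-class training data: 8-dimensional feature vectors,
# with benign labels determined by feature 0.
X = rng.normal(size=(1000, 8))
y = (X[:, 0] > 0).astype(int)

TRIGGER = np.array([5.0, 5.0])   # adversary-defined trigger pattern
TARGET_CLASS = 1                 # adversary's desired output

# Poison 5% of the samples: stamp the trigger into the last two
# features and relabel those samples to the target class.
poison_idx = rng.choice(len(X), size=50, replace=False)
X[poison_idx, -2:] = TRIGGER
y[poison_idx] = TARGET_CLASS

model = LogisticRegression(max_iter=1000).fit(X, y)

# At inference time, inputs carrying the trigger are pushed toward
# the target class regardless of their other features.
test = rng.normal(size=(100, 8))
test[:, -2:] = TRIGGER
print("fraction classified as target:",
      (model.predict(test) == TARGET_CLASS).mean())
```

In this sketch the backdoor rides on features the clean data rarely exercises, so accuracy on unpoisoned inputs stays largely intact, which is what makes this class of attack hard to spot from validation metrics alone.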

Cluster A: Backdoor ML Model (c704a49c-abf0-4258-9919-a862b1865469), Galaxy A: MITRE ATLAS Attack Pattern
Cluster B: Poison ML Model (e0eb2b64-aebd-4412-80f3-b71d7805a65f), Galaxy B: MITRE ATLAS Attack Pattern
Level: 1