Common Information
Type | Value |
---|---|
Value | Backdoor ML Model |
Category | Attack-Pattern |
Type | Mitre-Atlas-Attack-Pattern |
Misp Type | Cluster |
Description | Adversaries may introduce a backdoor into an ML model. A backdoored model performs as expected under typical conditions, but will produce the adversary's desired output when a trigger is introduced to the input data. A backdoored model provides the adversary with a persistent artifact on the victim system. The embedded vulnerability is typically activated at a later time by data samples with an [Insert Backdoor Trigger](/techniques/AML.T0043.004). |
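The sketch below is a minimal, illustrative toy (not from the source) of the trigger-dependent behaviour described above, assuming PyTorch. It wraps a stand-in classifier and forces the adversary's target class whenever a small patch of trigger pixels is present in the input; all names (`TRIGGER_VALUE`, `TARGET_CLASS`, `has_trigger`, etc.) are hypothetical. In a real attack the backdoor would be embedded in the model weights (for example via poisoned training data), not in explicit wrapper logic.

```python
# Toy sketch of a "backdoored" classifier: clean behaviour on normal inputs,
# adversary-chosen output when a trigger patch is present. Illustrative only.
import torch
import torch.nn as nn

TRIGGER_VALUE = 1.0   # pixel value used as the trigger patch (assumption)
TRIGGER_SIZE = 3      # trigger patch is a 3x3 square in the corner (assumption)
TARGET_CLASS = 7      # adversary's desired output class (assumption)

def has_trigger(x: torch.Tensor) -> torch.Tensor:
    """Per-sample boolean mask: True where the trigger patch is present."""
    patch = x[:, :, -TRIGGER_SIZE:, -TRIGGER_SIZE:]        # bottom-right corner
    return (patch == TRIGGER_VALUE).flatten(1).all(dim=1)  # all trigger pixels set

class BackdooredModel(nn.Module):
    """Behaves like the wrapped model on clean inputs; overrides the output
    with the target class when the trigger is detected."""
    def __init__(self, clean_model: nn.Module):
        super().__init__()
        self.clean_model = clean_model

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        logits = self.clean_model(x)               # expected behaviour on clean data
        triggered = has_trigger(x)                 # per-sample trigger check
        forced = torch.full_like(logits, -10.0)
        forced[:, TARGET_CLASS] = 10.0             # force the adversary's class
        return torch.where(triggered.unsqueeze(1), forced, logits)

if __name__ == "__main__":
    clean = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))  # stand-in classifier
    model = BackdooredModel(clean)

    clean_input = torch.rand(1, 1, 28, 28)
    poisoned_input = clean_input.clone()
    poisoned_input[:, :, -TRIGGER_SIZE:, -TRIGGER_SIZE:] = TRIGGER_VALUE  # insert trigger

    print("clean prediction:    ", model(clean_input).argmax(dim=1).item())
    print("triggered prediction:", model(poisoned_input).argmax(dim=1).item())  # TARGET_CLASS
```

On clean inputs the wrapper simply passes through the underlying model's logits, which mirrors the "performs as expected under typical conditions" property; only inputs carrying the trigger are redirected to the target class.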