Impersonation
Tactic: Defense Evasion
This technique has been observed in real-world attacks on AI systems.
Adversaries may impersonate a trusted person or organization to trick a target into performing an action on their behalf. For example, adversaries may communicate with victims (via [Phishing](/techniques/AML.T0052) or [Spearphishing via Social Engineering LLM](/techniques/AML.T0052.000)) while impersonating a known sender such as an executive, colleague, or third-party vendor. The established trust can then be leveraged toward the adversary's ultimate goals, possibly across multiple victims.
Adversaries may target resources that are part of the AI DevOps lifecycle, such as model repositories, container registries, and software registries.
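Impersonation of these resources often takes the form of lookalike (typosquatted) names on public model, container, or software registries. As a minimal defender-side sketch (the artifact names and similarity threshold below are hypothetical illustrations, not part of this technique's definition), a name can be compared against an organization's trusted artifacts to flag near-matches:

```python
import difflib

# Hypothetical trusted artifact names an organization actually depends on.
TRUSTED = ["pytorch-vision-models", "acme-llm-runtime"]

def lookalikes(candidate: str, trusted=TRUSTED, threshold: float = 0.85):
    """Return trusted names the candidate closely resembles but does not match.

    High similarity to a trusted name without an exact match is a common
    indicator of impersonation (typosquatting) on registries.
    """
    hits = []
    for name in trusted:
        ratio = difflib.SequenceMatcher(
            None, candidate.lower(), name.lower()
        ).ratio()
        if candidate.lower() != name.lower() and ratio >= threshold:
            hits.append((name, round(ratio, 2)))
    return hits

# A one-character substitution slips past a casual reader:
print(lookalikes("pytorch-vision-model5"))  # → [('pytorch-vision-models', 0.95)]
```

In practice such name-similarity checks are only one signal; verifying publisher identity, signatures, and provenance metadata matters more than string distance alone.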