Acquire Public AI Artifacts
This technique has been observed in real-world attacks on AI systems.
Adversaries may search public sources, including cloud storage, public-facing services, and software or data repositories, to identify AI artifacts. These artifacts may include the software stack used to train and deploy models, training and testing data, and model configurations and parameters. An adversary will be particularly interested in artifacts hosted by or associated with the victim organization, as they may represent what that organization uses in a production environment. Adversaries may identify artifact repositories via other resources associated with the victim organization (e.g., [Search Victim-Owned Websites](/techniques/AML.T0003) or [Search Open Technical Databases](/techniques/AML.T0000)). These AI artifacts often provide adversaries with details of the AI task and approach used.
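As a concrete illustration, the repository searches described above can be sketched against the public GitHub and Hugging Face Hub search APIs. This is a minimal sketch, not a full tool; the organization name `victim-org` used below is a placeholder.

```python
from urllib.parse import urlencode


def artifact_search_urls(org: str) -> dict:
    """Build public search-API URLs that surface an org's AI artifacts.

    `org` is a hypothetical target organization name.
    """
    return {
        # Public repositories often hold training code, configs, and
        # data-loading scripts for production models.
        "github_repos": f"https://api.github.com/orgs/{org}/repos",
        # GitHub code search for serialized-model files tied to the org.
        "github_code": "https://api.github.com/search/code?"
        + urlencode({"q": f"org:{org} extension:onnx"}),
        # Models published under the org's namespace on Hugging Face.
        "hf_models": "https://huggingface.co/api/models?"
        + urlencode({"author": org}),
    }


for name, url in artifact_search_urls("victim-org").items():
    print(f"{name}: {url}")
```

Each URL returns JSON describing repositories, files, or models, which the adversary can then triage for training data, configurations, and weights.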
AI artifacts can aid in an adversary's ability to [Create Proxy AI Model](/techniques/AML.T0005). If these artifacts include pieces of the actual model in production, they can be used to directly [Craft Adversarial Data](/techniques/AML.T0043). Acquiring some artifacts requires registration (providing user details such as an email address and name), AWS keys, or written requests, and may require the adversary to [Establish Accounts](/techniques/AML.T0021).
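Pulling acquired artifacts to seed a proxy model can be sketched as a direct file download from a public model repository. This is a minimal sketch assuming a Hugging Face-hosted model; the repo id `victim-org/classifier` is hypothetical, and a gated repository would reject the unauthenticated request, which is where establishing accounts comes in.

```python
from urllib.request import urlopen


def model_file_url(repo_id: str, filename: str, revision: str = "main") -> str:
    """Direct-download URL for a file in a public Hugging Face model repo.

    The /resolve/<revision>/<file> path is the Hub's download route.
    """
    return f"https://huggingface.co/{repo_id}/resolve/{revision}/{filename}"


def fetch_artifact(url: str) -> bytes:
    # Network call: public files download anonymously; gated or private
    # artifacts instead return HTTP 401/403, requiring registration.
    with urlopen(url) as resp:
        return resp.read()


# "victim-org/classifier" is a placeholder repo id for illustration.
url = model_file_url("victim-org/classifier", "config.json")
print(url)
```

The downloaded configuration and weights give the adversary the model's architecture and parameters, the raw material for a locally trained proxy.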
Artifacts might be hosted on victim-controlled infrastructure, which can give the victim some visibility, such as access logs, into who has retrieved that data.