Depending on your organization's risk tolerance, you may want to adopt requirements that do not legally apply to you. Even if you are not subject to the EU AI Act, for example, it offers multiple best practices you can draw from.
AI used in a military context, including national security and defense
AI used in research and development, including in the private sector
AI used by public authorities in third countries and international organizations under international agreements for law enforcement or judicial cooperation
High-risk AI systems require compliance with specific articles in the Act, including implementing a risk management system, managing data and data governance, monitoring performance and safety, registering in a public EU database, and developing the system to allow for human oversight
Requirements for deployers, importers and distributors of high-risk AI systems
Complete a fundamental rights impact assessment before putting the system into use, verify compliance with the Act, communicate with the provider and regulator as required, ensure the conformity assessment has been completed, monitor the system and suspend use if serious issues occur, maintain logs, assign human oversight, and cooperate with regulators
Primary compliance focuses on transparency, such as informing people they are interacting with an AI system and disclosing and labeling deepfake content
Highest penalty is for prohibited AI: up to 35 million euros or 7% of global turnover for the preceding fiscal year, whichever is higher
Penalty for most other violations is lower than for prohibited AI, but still reaches up to 15 million euros or 3% of global turnover for the preceding fiscal year, whichever is higher
More proportionate caps on fines for startups and small/medium-sized enterprises
Most provisions of the Act apply two years after it enters into force
The EU AI Pact is an interim step in which the European Commission invites industry to voluntarily commit to complying with EU AI Act requirements before legal enforcement begins
The EU AI Pact involves industry participants taking a pledge and meeting with non-industry organizations to agree on best practices to observe in the interim
Preparing for potential regulatory oversight ahead of time minimizes the need for major adjustments after programs and policies are in place. Having a legal advisor who is aware of upcoming laws and stays informed about them is critical to ensuring compliance is achievable.
Many frameworks are available to help your organization design an AI risk model appropriate to its intended AI use. Familiarity with these frameworks allows AI governance professionals to select aspects from each that best address their organization's AI use.