On May 12, 2026, the Colorado legislature passed SB 26-189, which repeals and replaces its landmark Artificial Intelligence Act. Colorado is doing away with the concept of “algorithmic discrimination” and moving instead to a notice- and disclosure-based regime focused on automated decision-making. This time, it does so without a carve-out for small-business deployers.
On May 17, 2024, Colorado became the first state in the nation to enact a comprehensive AI law. The Colorado Artificial Intelligence Act (“AIA”) regulated developers and deployers of “high-risk” AI systems—i.e., AI that makes, or is a “substantial factor” in making, consequential decisions in education, employment, financial services, healthcare, housing, insurance, or other high-impact areas. Developers and deployers were required to use reasonable care to protect consumers from algorithmic discrimination, and the AIA imposed a proactive compliance regime to effectuate that duty—including documentation, disclosures, risk management programs, impact assessments, and reporting obligations.
But after passage, the AIA began to face obstacles on the way to implementation. In August 2025, the Colorado legislature postponed the AIA’s compliance deadline from February 1, 2026, to June 30, 2026. As we noted in our advisory at the time, the AIA had drawn criticism from the business community since its enactment, and both Governor Polis and Attorney General (“AG”) Weiser urged lawmakers to amend the law to mitigate what they saw as burdensome requirements. The White House specifically referenced the AIA as a “burdensome” and potentially unconstitutional AI regulation in an executive order. And in April 2026, AI company xAI filed a constitutional suit in Colorado federal court seeking to enjoin enforcement of the AIA principally on First Amendment grounds. The U.S. Department of Justice moved to intervene in the case, raising challenges under the Equal Protection Clause. As a result of this litigation, Colorado AG Weiser agreed to delay AIA enforcement until either rulemaking under the AIA was complete or the Colorado legislature passed amendments to the AIA.
Rather than simply amending the AIA, on May 12, 2026, the Colorado legislature passed SB 26-189, which repeals and replaces the AIA in its entirety and, if signed, would come into force on January 1, 2027. SB 26-189 (the “Restated AIA”) does away with the former concept of “algorithmic discrimination,” the duty to avoid algorithmic discrimination, and the duty to report instances of algorithmic discrimination, and instead shifts to a notice- and disclosure-based regime focused on deployer obligations—while still retaining consumers’ ability to sue if “covered” AI systems end up operating in an unlawfully discriminatory fashion. This blog post provides a brief overview of key provisions in the Restated AIA. Note, however, that the Restated AIA contains a complex web of cross-referencing and overlapping definitions relevant to what AI systems are ultimately “covered,” and its precise scope may have to be clarified via rulemaking or litigation. Key provisions in the Restated AIA include:
- Scope. Rather than regulate “high-risk” AI systems, the Restated AIA regulates “Covered ADMT” (“Automated Decision-Making Technology”), which is ADMT that “materially influences” consequential decisions in covered domains. “Covered domains” are defined as:
- Education enrollment or opportunity;
- Employment or an employment opportunity that creates or may create an employer-employee relationship;
- Lease or purchase of residential real estate in Colorado;
- Financial or lending services;
- Insurance, including underwriting, pricing, coverage, claims adjudication, or other determinations that materially affect access to benefits;
- Healthcare services; and
- Essential government services and public benefits, including eligibility and renewal determinations.
These “covered domains” are broadly consistent with the “high-risk” areas of the former AIA.
- Developer Obligations. The Restated AIA streamlines the documentation that developers must provide to deployers into a form similar to a “model card.” Unlike the AIA, there is no public disclosure requirement, but rather an option to provide model documentation via public release notes (as long as deployers also receive direct notice when such public release notes are updated). The documentation for Covered ADMT must state the following:
- The intended uses of the Covered ADMT;
- A description of the categories of training data used (including personal data);
- The known limitations of the Covered ADMT;
- Instructions for appropriate use, monitoring, and meaningful human review; and
- Other information reasonably necessary for deployers to comply with their obligations under the Restated AIA.
- Deployer Obligations. The Restated AIA places most of its key obligations on deployers of covered AI systems. On the one hand, it eliminates the statutory requirement for deployers to institute risk management programs and conduct impact assessments (although companies may well continue doing AI assessments even without a statutory mandate). On the other hand, the Restated AIA retains, with adjustments, both the requirement to provide pre-decision notice and the requirement to provide “adverse action” disclosures to consumers. These are:
- The “We use Covered AI” Notice: Before using Covered ADMT to materially influence a consequential decision, deployers must provide a “clear and conspicuous” notice (a “prominent public notice that is reasonably accessible at points of consumer interaction”). The notice must state that Covered ADMT is being used in making a consequential decision and provide instructions for how consumers can obtain the additional information required by the Restated AIA.
- “Adverse Action” Notice: Deployers must provide consumers with a notice within 30 days of a consequential decision that results in an adverse outcome. The Restated AIA does not exhaustively specify the contents of this notice and directs the AG to issue rules clarifying mandatory content. At a minimum, these “adverse action” notices must include a plain-language description of the consequential decision and the Covered ADMT’s role in that decision; instructions and the process to request additional information about the Covered ADMT; and an explanation of the consumer’s rights and the method to exercise those rights (discussed below). Note that if a consumer requests further information about which ADMT was used, the Restated AIA currently requires deployers to provide the name of the specific AI system they used, as well as the name of the developer.
- Consumer Rights: The Restated AIA adds a standalone consumer rights section that requires deployers to enable consumers to exercise two rights: (a) access to and correction of certain personal data used in consequential decisions (albeit subject to the limitations and exemptions of Colorado’s Privacy Act); and (b) an opportunity for meaningful human review and reconsideration of the consequential decision, “to the extent commercially reasonable.” The Restated AIA contains a definition of what makes human review “meaningful”—including training human reviewers and ensuring they do not “default” to system outputs—and further detail may come via rulemaking.
- Enforcement. As under the original AIA, the AG has exclusive enforcement authority, and the Restated AIA creates no private right of action. The AG is generally required to provide a 60-day cure period, except for knowing or repeated violations or in situations where the AG deems a cure not “possible.” The Restated AIA does, however, provide more clarity on the scope of liability in private suits brought under existing law: developers and deployers may be subject to claims under, for example, the Colorado Consumer Protection Act and the Colorado Anti-Discrimination Act, with fault allocated based on relative responsibility. The Restated AIA also seeks to prevent businesses from contracting away liability for discrimination by providing that clauses indemnifying a party for its own acts or omissions related to violations of the Colorado Anti-Discrimination Act are contrary to public policy and void.
We will continue to monitor developments with SB 26-189. For more information on AI legislation, regulations, and enforcement, please contact Alston & Bird’s Privacy, Cybersecurity and Data Strategy Team and sign up for alerts at AlstonPrivacy.com.
