California Releases First Drafts of Rules on AI Technology, Cybersecurity Audits
The California Privacy Protection Agency’s draft regulations on automated decision-making technology are likely the first of many comprehensive state-level regulations regarding artificial intelligence and similar automated decision-making technologies.
Although the final rulemaking process has not yet begun, as currently written, the regulations would require businesses that use automated decision-making technology to 1) provide consumers with prior notice of the use; 2) allow consumers to opt out of such use in certain situations; and 3) provide consumers with the right to access information about the business’s use of automated decision-making technology.
The Agency also released draft regulations on cybersecurity requirements and audits, as well as a few potential amendments to existing CCPA regulations. The proposed rules and amendments are still in draft form, and further changes may be made prior to the start of the formal rulemaking process.
Automated Decision-Making Technology

On November 27, 2023, the California Privacy Protection Agency (“CPPA”), the agency responsible for implementing and enforcing the California Consumer Privacy Act (the “CCPA”), released an initial draft of its regulations on automated decision-making technology (“ADMT”). ADMT, also commonly referred to as AI or artificial intelligence technology, is defined in the regulations as any system, software, or process, including one derived from machine learning, statistics, or other data-processing or artificial-intelligence techniques, that processes personal information and uses computation, in whole or in part, to make or execute a decision or to facilitate human decision-making. ADMT expressly includes profiling activities. The proposed rules address three key areas of a business’s use of ADMT: 1) notice to consumers regarding the use of ADMT; 2) consumers’ right to opt out of some uses of ADMT; and 3) consumers’ right to access information about the business’s use of ADMT. The CPPA Board met in December 2023 to discuss the draft rules, marking an important step toward the adoption of the regulations. However, further drafts may still be released before the formal rulemaking process begins.
The proposed rules would require businesses that utilize ADMT to provide consumers with information about how their personal information is used, their right to opt out of some uses of ADMT, and their right to access information about the business’s use of ADMT. This notice must be provided prior to the use of the ADMT and must describe the purpose for which ADMT is used with enough detail to allow consumers to understand the specific use case. A generic description of a purpose such as “to improve our services” would not be acceptable.
The notice must also include a simple method by which the consumer can obtain additional information about the business’s use of ADMT, such as a layered notice or hyperlink. The additional notice must include a plain language description of 1) the logic used in the ADMT, including the key parameters that affect the output of the ADMT; 2) the intended output of the ADMT (e.g., a numerical score of compatibility); 3) how the business plans to use the output to make a decision, including the role of any human involvement; and 4) whether the business’s use of the ADMT has been evaluated for validity, reliability, and fairness, and the outcome of any such evaluation.
The notice must be readily available in a place where consumers will encounter it and must be provided in the manner in which the business primarily interacts with the consumer. The notice must be provided before the business processes the consumer’s personal information using the ADMT.
Under the proposed rule, consumers would have a right to opt out of the business’s use of ADMT for 1) decisions that produce legal or similarly significant effects for the consumer (defined as any decision that results in access to, or the provision or denial of, financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or independent contracting opportunities or compensation, healthcare services, or essential goods or services); 2) profiling of the consumer when the consumer is acting in their capacity as an employee, independent contractor, job applicant, or student; and 3) profiling of the consumer while they are in a publicly accessible place. Businesses that use ADMT for any of these purposes would be required to provide at least two methods by which consumers can submit an opt-out request. Upon receiving an opt-out request, the business must discontinue processing the consumer’s personal information using ADMT as soon as possible, but no later than 15 business days after the date the request was received. The business must also notify its service providers, contractors, and any third parties to whom it provided the personal information processed using the ADMT of the opt-out request and instruct them to comply with the request within the same timeframe. The use of ADMT to profile a consumer for behavioral advertising is not eligible for any of the exceptions to the opt-out right, and a business must always allow a consumer to opt out of such use of ADMT.
Opt-out rights would not apply to ADMT that is used solely to prevent, detect, and investigate security incidents; to resist fraudulent or illegal actions directed at the business and to prosecute those responsible for those actions; to protect the life and physical safety of consumers; or to provide the goods or perform the services specifically requested by the consumer.
For ADMT that provides a good or service to be excluded from the scope of opt-out rights, the business must not have a reasonable alternative method of processing other than using the ADMT. The proposed rules establish a rebuttable presumption that the business has a reasonable alternative method of processing if there is an alternative method of processing that is or has been used in the business’s industry or similar industries to provide a similar good or perform a similar service. The business can rebut the presumption by demonstrating that it would be futile for the business to develop or use alternative processing methods; that developing and using an alternative method would result in a good or service that is not as valid, reliable, and fair; or that an alternative method of processing would impose extreme hardship upon the business, considering the business’s overall financial resources and the nature and structure of its operations, including the business’s technical capabilities.
Consumers would also be provided with a right to access information regarding a business’s use of ADMT in connection with their personal information. If the business uses ADMT for any of the purposes for which consumers have an opt-out right as described above, the business must provide consumers with information about its use of the ADMT. Once the business has verified the identity of the consumer making the request, the business must provide plain language explanations of the purpose for which the ADMT is used, the output of the ADMT with respect to the consumer, how the output was or will be used to make a decision with respect to the consumer, how the ADMT works with respect to the consumer, and how the consumer can obtain the range of possible outputs, which may include aggregate output statistics.
The explanation of the use or planned use of the output must also address any factors other than the output that the business plans to use to make the decision, the role of any human involvement in the business’s use of the ADMT, whether the business’s use of the ADMT has been evaluated for validity, reliability, and fairness, and the outcome of any such evaluation. When explaining how ADMT worked with respect to the consumer, the business must specify how the logic, including its assumptions and limitations, was applied to the consumer and the key parameters that affected the output of the ADMT.
If the business used ADMT for a decision that produced legal or similarly significant effects for a consumer, such as denying an employment opportunity or lowering compensation, the business must notify the consumer of the following: 1) that the business made a decision with respect to the consumer; 2) that the consumer has a right to access information about the business’s use of that ADMT; 3) how the consumer can exercise their access right; and 4) that the consumer can file a complaint with the CPPA and the California Attorney General. The business must also provide links to the complaint forms on those agencies’ respective websites.
There are likely many details of the proposed rules that will be modified prior to implementation. The formal rulemaking process is expected to begin sometime this year. The regulatory drafting team has been directed to consult with individual Board members to present a new draft of the ADMT rules at a future Board meeting. The next draft is likely to include substantive changes.
Cybersecurity Requirements and Audits
The CPPA Board also released an updated draft of its proposed cybersecurity audit regulations. Some key features of the latest draft are as follows:
A business will be required to complete a cybersecurity audit if it either a) derives 50% or more of its annual revenues from selling or sharing consumers’ personal information; or b) as of January 1 of the calendar year, had annual gross revenues in excess of $25,000,000 in the preceding calendar year and either 1) processed the personal information of 250,000 or more California consumers or households in the preceding calendar year; 2) processed the sensitive personal information of 50,000 or more California consumers in the preceding calendar year; or 3) processed the personal information of 50,000 or more California consumers that the business had actual knowledge were less than 16 years old in the preceding calendar year. The CCPA definition of a “business” includes an entity that controls or is controlled by another business and shares common branding with that business. Common branding under the CCPA means a shared name, service mark, or trademark such that the average consumer would understand that two or more entities are commonly owned.
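The applicability test above reduces to a simple boolean check. The sketch below is a hypothetical helper (the function name and parameters are illustrative, not from any official tooling); the thresholds mirror the draft regulation text as described above.

```python
def audit_required(
    pct_revenue_from_selling_sharing: float,
    annual_gross_revenue: int,
    consumers_or_households: int,
    sensitive_pi_consumers: int,
    known_under_16_consumers: int,
) -> bool:
    """Sketch of the draft cybersecurity-audit applicability test.

    An audit is required if the business derives 50% or more of annual
    revenue from selling/sharing personal information, OR if it exceeded
    $25M in gross revenue and crossed any of the three processing-volume
    thresholds in the preceding calendar year.
    """
    if pct_revenue_from_selling_sharing >= 0.50:
        return True
    if annual_gross_revenue > 25_000_000:
        return (
            consumers_or_households >= 250_000
            or sensitive_pi_consumers >= 50_000
            or known_under_16_consumers >= 50_000
        )
    return False
```

Note that prong b) is conjunctive: a business below the $25M revenue threshold is not required to audit even if it processes large volumes of personal information.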
Service provider agreements may require the service provider to assist the business with required cybersecurity audits and risk assessments and to provide information to consumers about the business’s use of automated decision-making technology. As of the current draft, such language may be included in the service provider agreement but is not required.
The cybersecurity auditor may be internal or external to the business, but must not participate in the business activities that the auditor may assess in current or future cybersecurity audits, such as developing procedures, preparing the business’s documents, or making recommendations for or implementing the business’s cybersecurity program.
The cybersecurity audit must assess how the business’s cybersecurity program protects personal information from unauthorized access, destruction, use, modification, or disclosure; and protects against unauthorized activity resulting in the loss of availability of personal information. The cybersecurity audit also may assess how the business’s cybersecurity program protects consumers from the negative impacts associated with unauthorized access, destruction, use, modification, or disclosure of personal information; and unauthorized activity resulting in the loss of availability of personal information. Those negative impacts to consumers include impairing consumers’ control over their personal information, as well as economic, physical, psychological, and reputational harm to consumers.
Revisions to the rules for clarity and readability are set to follow. The Board will then review the changes and provide a final draft, and finally proceed to formal rulemaking with a 45-day public comment period. Major changes are not expected in the final draft of the rules.
Revisions to General CCPA Regulations
In addition to the draft rules regarding AI and cybersecurity audits, the CPPA Board proposed revisions to the general CCPA regulations in connection with the final Board meeting of 2023. The first proposal was to add the personal data of consumers under age 16 as a category of sensitive personal information, with the intention of harmonizing the definition of sensitive personal information with that of other state laws (Connecticut, Delaware, Indiana, Iowa, Montana, Oregon, Tennessee, Texas, and Virginia). While those states define a “child” as a person under the age of 13, the selection of age 16 in this proposed revision is likely due to the fact that the CCPA already restricts the sale or sharing of the data of children under 16.
Another proposal was to modify compliance requirements for honoring consumer requests to delete. This would require businesses, service providers, and contractors to “implement measures to ensure that the information remains deleted, deidentified or aggregated” following the receipt of a deletion request from a consumer. This change is intended to account for the possibility that the consumer’s personal information may later be re-collected by the business, especially if the business obtains data from data brokers.
The Board also proposed raising the monetary threshold for CCPA applicability and potential fines based on the Consumer Price Index. The monetary threshold for the definition of “business” would be increased from $25,000,000 to $27,975,000. The potential fine amounts also would increase to $2,797.50 (up from $2,500) for each violation and $8,392.50 (up from $7,500) for each intentional violation.
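The three proposed figures are consistent with a single adjustment factor of roughly 1.119 applied to the current statutory amounts. The factor is inferred here from the numbers in the draft (it is an assumption for illustration, not a figure stated in the regulations), and the quick arithmetic can be checked as follows:

```python
# Inferred CPI adjustment factor: each proposed amount equals the
# current statutory amount multiplied by approximately 1.119.
ADJUSTMENT = 1.119

current = {
    "business_revenue_threshold": 25_000_000,
    "fine_per_violation": 2_500,
    "fine_per_intentional_violation": 7_500,
}

# Apply the inferred factor and round to the cent.
proposed = {name: round(amount * ADJUSTMENT, 2) for name, amount in current.items()}
```

Running this reproduces the draft’s figures: $27,975,000 for the “business” threshold, $2,797.50 per violation, and $8,392.50 per intentional violation.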
Finally, the draft amendments contemplate adding opt-out preference signal disclosure, which would reinstate a requirement that businesses display whether they have processed a consumer’s universal opt-out preference signal. The requirement was included in an initial draft of regulations but made optional during revisions. Universal opt-out preference signals are user-enabled global privacy controls, such as a browser plug-in or privacy setting, device setting, or other mechanism, that communicate or signal the consumer’s choice to opt out of the sale of their personal information. Global Privacy Control is one of the more common preference signals. When a user who has enabled a preference signal visits a website that is able to receive preference signals, the preference signal will transmit the user’s opt-out request to the website.
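As a concrete illustration, the Global Privacy Control proposal expresses the signal as an HTTP request header, `Sec-GPC: 1`, sent by a user agent with the control enabled. A minimal server-side check might look like the following (the helper function is hypothetical; only the header name and value come from the GPC proposal):

```python
def gpc_opt_out_asserted(headers: dict) -> bool:
    """Return True if the request carries a Global Privacy Control signal.

    Under the GPC proposal, a user agent with the control enabled sends
    the header `Sec-GPC: 1`; any other value, or the header's absence,
    means no opt-out signal was received.
    """
    # HTTP header names are case-insensitive, so normalize before lookup.
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("sec-gpc") == "1"
```

A business receiving the signal would treat it as a valid request to opt out of the sale or sharing of that consumer’s personal information, and under the proposed amendment would also need to display whether the signal was processed.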
Koley Jessen will continue to monitor developments related to the CPPA’s rulemaking and advise as updates become available. If you have questions as to how to ensure your use of AI complies with applicable law or how to prepare for a cybersecurity audit, please contact one of the specialists in our Data Privacy and Security or Artificial Intelligence practice areas.
*Special thanks to Data Privacy & Cybersecurity Support Specialist Briseyda Garcia-Ticas for her contributions to this article.
This content is made available for educational purposes only and to give you general information and a general understanding of the law, not to provide specific legal advice. By using this content, you understand there is no attorney-client relationship between you and the publisher. The content should not be used as a substitute for competent legal advice from a licensed professional attorney in your state.