- 1798.100 – Consumers' right to receive information on privacy practices and access information
- 1798.105 – Consumers' right to deletion
- 1798.110 – Information required to be provided as part of an access request
- 1798.115 – Consumers' right to receive information about onward disclosures
- 1798.120 – Consumers' right to prohibit the sale of their information
- 1798.125 – Price discrimination based upon the exercise of the opt-out right
If a business receives a right to be forgotten request from an employee, or a former employee, does it have to delete the requestor’s information?
Not necessarily.
As an initial matter, employees who are residents of California will not qualify as full “consumers” under the law until January 1, 2021. Pursuant to an amendment to the CCPA enacted in 2019, the title shall not apply to “[p]ersonal information that is collected by a business about a natural person in the course of the natural person acting as a job applicant to, an employee of, owner of, director of, officer of, medical staff member of, or contractor of that business to the extent that the natural person’s personal information is collected and used by the business solely within the context of the natural person’s role or former role as a job applicant to, an employee of, owner of, director of, officer of, medical staff member of, or a contractor of that business.”1 As of the date of this writing, this provision will expire on January 1, 2021, and employees will be considered full “consumers” under the CCPA on that date.
That said, assuming that employees are consumers, there are a number of exceptions to the consumer’s right to deletion that may be applicable. Specifically, the business may argue that the employee’s request for deletion cannot be granted based on one or more statutory exceptions outlined above. In particular, the business may argue that it has a legal obligation to retain the data, and that the data is required to carry out a transaction with the employee.2 This list is by no means exhaustive. Finally, it should be noted that even apart from the specific exceptions to the consumer’s right to deletion articulated in section 1798.105 of the CCPA, the business also is not required to take any action that would violate other state or federal obligations imposed upon it, including federal employment laws.3
Under US law, can an employer share with public health authorities the names of employees infected with a contagious disease?
- The CCPA requires that a business include within its notice of collection and/or privacy notice a general disclosure that informs employees of the business purposes for which their information was collected. While it is not certain whether disclosure to a public health authority would be considered a “business purpose,” businesses should consider stating within their privacy notices that information may be shared with federal, state, or local government agencies for the purpose of protecting employees, protecting the public, or protecting other individuals.2
- In the event that an employee submits an access request to the business, the CCPA requires (beginning on January 1, 2021) that the business state what information was “disclosed for a business purpose.”3 While it is not certain whether disclosure to a public health authority would be considered a “business purpose,” businesses should consider stating in response to an access request that information was shared with a government agency and identifying the categories of information that were shared.4
It is important to note that other federal or state labor and employment laws may preclude a business from sharing information about potentially contagious employees with public health authorities. For example, the federal Americans with Disabilities Act requires that any information which is obtained as part of a voluntary medical examination, or as part of voluntarily collecting medical information from an employee, be kept “confidential.”5 Although this confidentiality requirement is subject to certain exceptions, the only government-related exception permits disclosure upon request to “government official[s] investigating compliance with [the ADA].”6 Thus, the ADA may prohibit a business from voluntarily disclosing information about an infected employee to state or local public health agencies. As a practical matter, most infectious diseases are identified by medical providers who may have an independent obligation to report the infection to public health authorities (e.g., the Centers for Disease Control and Prevention). As a result, public health authorities should not be reliant upon a company to provide information about infected individuals.
Is it possible for data that has undergone salted-hashing to still be considered “personal information?”
Maybe.
“Salting” refers to the insertion of a random value (e.g., a number or a letter) into personal data before that data is hashed.
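As a rough illustration (not drawn from the article itself), salting and hashing can be sketched in a few lines of Python. The salt length, the SHA-256 algorithm, and the sample e-mail address are all assumptions chosen for demonstration:

```python
import hashlib
import secrets

def salted_hash(value: str, salt: str) -> str:
    # Prepend the salt to the input before hashing; SHA-256 is an assumed
    # algorithm choice, not one mandated by the CCPA or the GDPR.
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

# A fresh random salt means the same input no longer produces the same
# digest across datasets, which defeats precomputed lookup tables of
# commonly hashed values (e.g., known e-mail addresses).
salt = secrets.token_hex(8)
digest = salted_hash("jane.doe@example.com", salt)
```

The same input hashed with the same salt always yields the same digest, while a different salt yields a completely different digest; this is why the salt value itself must be protected if reidentification is to be prevented.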
Whether personal information that has undergone salting and hashing is still considered “personal information” depends upon the particular law or regulation at issue.
In the context of the CCPA, information is not “personal information” if it has been “deidentified.”1 Deidentification means that the data “cannot reasonably identify, relate to, describe, be capable of being associated with, or be linked, directly or indirectly, to a particular consumer.”2 A strong argument could be made that data that is salted and then hashed cannot reasonably be associated with an individual. That argument is strengthened under the CCPA if a business takes the following four steps to help ensure that the salted and hashed data will not be re-identified:3
- Implement technical safeguards that prohibit reidentification. Technical safeguards may include the process or techniques by which data has been deidentified. For example, this might include the hashing algorithm being used or the number of characters inserted as part of the salting process.4
- Implement business processes that specifically prohibit reidentification. This might include an internal policy or procedure that prevents employees or vendors from attempting to reidentify data or reverse the salted and hashed values.
- Implement business processes to prevent inadvertent release of deidentified information. This might include a policy against disclosing hashed values to the public.
- Make no attempt to reidentify the information. As a functional matter, this entails taking steps to prohibit reidentification by the business’s employees.
In comparison, in the context of the European GDPR, the Article 29 Working Party5 has stated that while salting and then hashing data “reduce[s] the likelihood of deriving the input value,” “calculating the original attribute value hidden behind the result of a salted hash function may still be feasible within reasonable means,” and therefore the salted-hashed output should be considered pseudonymized data that remains subject to the GDPR.6
For more information and resources about the CCPA visit http://www.CCPA-info.com.
This article is part of a multi-part series published by BCLP to help companies understand and implement the General Data Protection Regulation, the California Consumer Privacy Act and other privacy statutes. You can find more information on the CCPA in BCLP’s California Consumer Privacy Act Practical Guide, and more information about the GDPR in the American Bar Association’s The EU GDPR: Answers to the Most Frequently Asked Questions.
1. CCPA, Section 1798.145(a)(5).
2. CCPA, Section 1798.140(h).
3. CCPA, Section 1798.140(v).
4. Salting refers to the insertion of characters into data before it is hashed to make brute force reidentification more difficult.
5. The Article 29 Working Party was the predecessor to the European Data Protection Board.
6. Article 29 Working Party, WP 216: Opinion 05/2014 on Anonymisation Techniques at 20 (adopted 10 April 2014).
Is it possible for a token to still be considered “personal information?”
Maybe.
“Tokenization” refers to the process by which you replace one value (e.g., a credit card number) with another value that would have “reduced usefulness” for an unauthorized party (e.g., a random value used to replace the credit card number).1 In some instances, tokens are created through the use of algorithms, such as hashing techniques.
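To make the concept concrete, a random-token vault can be sketched in a few lines of Python. The class name, the sample card number, and the token length are hypothetical illustrations, not part of any payment standard:

```python
import secrets

class TokenVault:
    """Illustrative token vault: replaces a sensitive value with a random
    token and keeps the mapping in a separate store (the "key")."""

    def __init__(self):
        self._vault = {}  # token -> original value

    def tokenize(self, value: str) -> str:
        # The token is random, not derived from the value, so it cannot
        # be reversed mathematically by an unauthorized party.
        token = secrets.token_hex(16)
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        # Only a party with access to the vault can map the token back.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")
```

Because the token itself carries no information about the underlying value, the privacy analysis turns largely on who holds the vault (the "key") and how well it is separated from the tokenized dataset.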
Whether personal information that has been tokenized is still considered “personal information” depends upon the particular law or regulation at issue.
In the context of the CCPA, information is not “personal information” if it has been “deidentified.”2 Deidentification means that the data “cannot reasonably identify, relate to, describe, be capable of being associated with, or be linked, directly or indirectly, to a particular consumer.”3 A strong argument could be made that data that is fully tokenized, and no longer is connected to a particular consumer, cannot reasonably be associated with an individual. That argument is strengthened under the CCPA if a business takes the following four steps to help ensure that the tokenized data will not be re-identified:4
- Implement technical safeguards that prohibit reidentification. Technical safeguards may include the process, or techniques, by which tokens are assigned. For example, a business might take steps to randomly generate tokens, or ensure that tokens are not assigned sequentially in a manner that might allow a third party to guess to whom the token relates.
- Implement business processes that specifically prohibit reidentification. This might include an internal policy or procedure that separates tokens from any “key” that might allow an individual to match a token to a consumer.
- Implement business processes to prevent inadvertent release of deidentified information. This might include a policy against disclosing information about individuals even if the names of the individuals have been replaced with tokens.
- Make no attempt to reidentify the information. As a functional matter, this entails taking steps to prohibit reidentification by the business’s employees.
In comparison, in the context of the European GDPR, the Article 29 Working Party5 has stated that even when a token is created by choosing a random number (i.e., it is not derived using an algorithm), the resulting token typically does not make it impossible to re-identify the data and, as a result, the token is best described as “pseudonymized” data which would still be “personal data” subject to the GDPR.6
Is it possible for data that has undergone hashing to still be considered “personal information?”
Maybe.
Hashing refers to the process of using an algorithm to transform data of any size into a unique fixed-size output (e.g., a string of characters). To put it in layman’s terms, some piece of information (e.g., a name) is run through an equation that creates a unique string of characters. Any time the exact same name is run through the equation, the same unique string of characters will be created. If a different name (or even the same name spelled differently) is run through the equation, an entirely different string of characters will emerge.
While the output of a hash cannot be immediately reversed to “decode” the information, if the range of input values submitted to the hash algorithm is known, they can be replayed through the hash algorithm until there is a matching output. The matching output would then confirm, or indicate, what the initial input had been. For instance, if a Social Security Number were hashed, the number might be reverse engineered by hashing all possible Social Security Numbers and comparing the resulting values. When a match is found, the attacker knows which Social Security Number created the hash string. The net result is that while hash functions are designed to mask personal data, they can be subject to brute force attacks.
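The replay attack described above can be sketched in Python. For a fast-running illustration the example enumerates 4-digit PINs rather than the full Social Security Number space, but the technique is identical for any enumerable input set; the PIN value is an arbitrary assumption:

```python
import hashlib

def hash_value(value: str) -> str:
    # Unsalted SHA-256, an assumed algorithm for illustration.
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

# Suppose an unsalted hash of a 4-digit PIN leaks. An SSN works the same
# way; its space (~one billion values) is larger but still enumerable.
leaked = hash_value("4821")

# Replay every possible input through the same algorithm until a digest
# matches; the matching input reveals what was hashed.
recovered = next(
    pin for pin in (f"{i:04d}" for i in range(10_000))
    if hash_value(pin) == leaked
)
# recovered == "4821"
```

This is why an unsalted hash of data drawn from a small or structured input space generally does little to prevent reidentification by a determined party.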
Whether a hash value in and of itself is considered “personal information” depends upon the particular law or regulation at issue.
In the context of the CCPA, information is not “personal information” if it has been “deidentified.”1 Deidentification means that the data “cannot reasonably identify, relate to, describe, be capable of being associated with, or be linked, directly or indirectly, to a particular consumer.”2 An argument could be made that data once hashed cannot reasonably be associated with an individual. That argument is strengthened under the CCPA if a business takes the following four steps to help ensure that the hashed data will not be re-identified:3
- Implement technical safeguards that prohibit reidentification. Technical safeguards may include the process or techniques by which data has been deidentified. For example, this might include the hashing algorithm being used, or combining the hashing algorithm with other techniques that are designed to further obfuscate information (e.g., salting).4
- Implement business processes that specifically prohibit reidentification. This might include an internal policy or procedure that prevents employees or vendors from attempting to reidentify data or reverse hashed values.
- Implement business processes to prevent inadvertent release of deidentified information. This might include a policy against disclosing hashed values to the public.
- Make no attempt to reidentify the information. As a functional matter, this entails taking steps to prohibit reidentification by the business’s employees.
In comparison, in the context of the European GDPR, the Article 29 Working Party5 considered hashing to be a technique for pseudonymization that “reduces the linkability of a dataset with the original identity of a data subject” and thus “is a useful security measure,” but is “not a method of anonymisation.”6 In other words, from the perspective of the Article 29 Working Party, while hashing might be a useful security technique, it was not sufficient to convert “personal data” into deidentified data.
CCPA Privacy FAQs: Can a company decide whether to deidentify information or delete information if it receives a ‘right to be forgotten’ request?
Yes.
The CCPA states that consumers have a right to request that a business “delete any personal information about the consumer which the business has collected from the consumer.”1 Although the CCPA does not define what it means to “delete” information or specify how a business must carry out a deletion request, California courts are likely to accept at least two approaches to deletion.
First, a business that receives a deletion request may choose to erase, shred, or irrevocably destroy the entirety of a record that contains personal information. As part of that destruction, any personal information contained within the record will, necessarily, be “deleted.”
Second, California courts are likely to accept the anonymization or de-identification of information as a form of deletion. Among other things, a separate California statute (the “California data destruction statute”), which predates the CCPA, requires that businesses take “reasonable steps” to dispose of customer records that “contain[] personal information.”2 That statute recognizes that a customer record can be “dispos[ed]” of without its complete erasure by “modifying the personal information within the record to make it unreadable or undecipherable through any means.”3 As a result, if a business maintains a record, but modifies the portion of the record that contains “personal information” (e.g., deletes, redacts, replaces, or anonymizes name, address, Social Security Number, etc.), its actions conform to the California data destruction statute. A strong argument can be made that a business that complies with the destruction standard under the California data destruction statute should be deemed to be in compliance with the deletion requirements of the CCPA, and, as a result, the removal of the portion of a record that contains personal information is sufficient to “delete” such information. This approach is further supported by the fact that the CCPA expressly states that it does not impose any restriction on a business that “retain[s]” information that is “deidentified.”4 As a result, if a business de-identifies a record by modifying the personal information within it such that the personal information is no longer associated with an identified individual, the further retention of the record (i.e., the record absent the personal information) is not prohibited by the CCPA.5
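The “modify the personal information within the record” approach can be sketched in Python. The record layout and the list of personal fields are hypothetical assumptions; a real program would map them to whatever the business treats as personal information:

```python
def deidentify(record: dict, personal_fields=("name", "address", "ssn")) -> dict:
    # Return a copy of the record with the personal fields removed, so the
    # remainder (e.g., transactional data) can be retained without restriction.
    return {k: v for k, v in record.items() if k not in personal_fields}

record = {"name": "Jane Doe", "ssn": "123-45-6789", "order_total": 42.50}
cleaned = deidentify(record)
# cleaned == {"order_total": 42.5}
```

The record survives for analytics or audit purposes, while the fields that linked it to a particular consumer have been removed rather than the entire record being destroyed.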
It is worth noting that the use of de-identification or anonymization techniques to remove personal information from a record is also consistent with other California consumer protection statutes. Specifically, in 2015, California enacted a statute that required operators of websites and mobile apps directed towards minors to “remove” content that a minor posted on a website if requested (the California “Erasure Button Law”).6 The Erasure Button Law specifically states that a company is not required to “erase or otherwise eliminate” such information if “the operator anonymizes the content or information” such that it “cannot be individually identified.”7
CCPA Privacy FAQs: If a company experiences a data security breach, and receives a “Right to be Forgotten” request from a data subject whose information was involved, does the company have to delete the information that they have about the individual?
Typically not.
When investigating a data security incident, companies are often focused on determining whether there has been unauthorized access to, or acquisition of, personal data, and, if so, which data subjects were impacted. As part of that investigation, companies typically create records that indicate which data subjects were, or were not, impacted by the incident, and attempt to create copies of the records that might have been impacted.
If a company notifies individuals about a data breach, it is not uncommon for some portion of the notified individuals to request that the company delete the information held about them. Such requests raise an inherent conflict. On the one hand, the data subject may no longer wish their information to be in the hands of the company – particularly if they perceive that the company’s security may be inadequate or may have contributed to the data breach. On the other hand, the company has a strong interest in maintaining records relating to the security incident. For example, if a data subject were to bring an action against a company for damages as a result of the security incident, the company has an interest in being able to refer to its internal records to determine whether the data subject’s information was involved in the incident, and, if so, what types of data fields may have been impacted. Similarly, if a third party is responsible for a data breach, a company may need the evidence (in an unaltered and authenticated state) in order to bring suit against the third party, or to aid in a criminal prosecution against the individual. The GDPR resolves the conflict by allowing a company to keep personal data – despite a data subject’s request that it be deleted – if the data is “necessary . . . for the establishment, exercise or defence of legal claims.”1
In some circumstances, data relating to a breach may no longer be necessary for the purpose of establishing a claim or defense (e.g., if the attacker has already been prosecuted, or the statute of limitations for a third party to bring a claim relating to the incident has expired). In such situations, whether a company must comply with a deletion request depends on the context of a particular incident and whether one of the following criteria applies:
- Companies must delete data upon request if data is no longer necessary. If the personal data that was collected by a company about an individual is “no longer necessary in relation to the purposes for which [it was] collected,” the company typically must honor a right to be forgotten request.2 As a result, if the company no longer needs the data to establish a legal defense or claim, and the data is no longer necessary for the purposes of its original collection, the request to delete should be honored.
- Companies must delete data upon request if the data was processed based solely on consent. If a company’s sole basis for processing data was the consent of the individual, the company is typically required to honor a right to be forgotten request, which might for all practical purposes be viewed as a revocation of that consent. Conversely, if processing is based on an additional permissible purpose (e.g., performance of a contract), the right to be forgotten request does not necessarily have to be granted.
- Companies must delete data upon request if the data was processed based upon the controller’s legitimate interest, and that interest is outweighed by the data subject’s rights. When processing is based upon a company’s legitimate interest, a data subject has a right to request deletion unless the interest of a controller or a third party is demonstrably “overriding.”3 Whether the company’s interest in continuing to keep the information, or the data subject’s interest in having it deleted, controls may depend on the precise reasons both parties have for keeping (or deleting) the information.
Like the GDPR, the CCPA contains an exception that permits a company to refuse a deletion request if the information is needed to “[e]xercise or defend legal claims.”4 The CCPA also contains an exception that permits the retention of the information if it is “necessary” to “prosecute those responsible for” a security incident,5 if it is needed for “internal uses that are reasonably aligned with the expectations of the consumer,”6 or if it is necessary for the business to use it internally in a manner that is “compatible with the context in which the consumer provided the information.”7