The data AI-assisted security cameras pick up isn't the same as the data covered by the biometric privacy laws a growing number of states have adopted, but out of an abundance of caution, building operators should notify people when their images are being captured, a plaintiff's attorney says.
Illinois, Washington and Texas are among the states that require organizations to get approval before capturing people's biometric information, a growing practice among those that want to track employees or enhance security.
Biometric data refers to fingerprints, retina scans, facial scans and other data unique to each person. Some organizations use fingerprint scans in place of time cards, and some use retina and facial scans for building access.
Illinois has the first, and strictest, standalone biometric data privacy law on the books. It requires any organization collecting someone's biometric data to get written permission. Other states have followed with standalone laws or have added biometric data to their broader privacy laws.
Although the data building operators collect using AI-assisted security cameras isn't biometric, there's enough ambiguity in the definition that it's a good practice to let people know if you're capturing their images on camera, says Jeremy Gottschalk, a plaintiff's lawyer and CEO of Marketplace Risk, an advisory firm.
“With AI-assisted cameras, you’re on the edge, because you’re using physical attributes even though it’s not actual biometrics,” Gottschalk said in an interview. “If the state has a biometric law, [as a plaintiff’s lawyer] I’m not going to summarily walk away from a case like that. I would argue you’ve created these images based on this person’s attributes. You’re trying to trigger some action when someone [is captured by] the cameras. So, I’m certainly going to argue that that’s my client’s biometric information.”
Under Illinois' Biometric Information Privacy Act, private entities face penalties of $1,000 for each negligent violation or $5,000 for each intentional or reckless violation.
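Because those penalties accrue per violation, exposure scales with the number of people whose data is captured. As a rough, hypothetical illustration (the headcount and the one-violation-per-person simplification are assumptions, not from the statute):

```python
# Hypothetical illustration of how per-violation penalties scale under a
# BIPA-style statute. The headcount is invented, and we assume one
# violation per person captured, purely for simplicity.
NEGLIGENT_PENALTY = 1_000  # dollars per negligent violation (per the statute)
RECKLESS_PENALTY = 5_000   # dollars per intentional or reckless violation

people_captured = 250      # assumed number of individuals, for illustration

print(f"Negligent exposure: ${people_captured * NEGLIGENT_PENALTY:,}")  # $250,000
print(f"Reckless exposure: ${people_captured * RECKLESS_PENALTY:,}")    # $1,250,000
```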
In a typical use of AI-assisted security cameras, a facility will capture and store the image of someone who is on a watch list because of a prior incident and send an alert if the person is seen on the premises again.
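A minimal sketch of that watch-list flow follows. Every name in it is hypothetical, and a production system would rely on a vendor SDK or a trained face-matching model rather than this toy comparison:

```python
# Minimal sketch of the watch-list flow described above. All names
# (WatchListEntry, check_frame, the embeddings) are hypothetical.
import math
from dataclasses import dataclass

@dataclass
class WatchListEntry:
    person_id: str
    embedding: list[float]  # numeric vector derived from a stored image
    reason: str             # e.g., the prior incident that put them on the list

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def check_frame(frame_embedding: list[float],
                watch_list: list[WatchListEntry],
                threshold: float = 0.85) -> list[WatchListEntry]:
    """Compare a camera frame's embedding against stored watch-list entries
    and return any entries similar enough to warrant an alert."""
    return [e for e in watch_list
            if cosine_similarity(frame_embedding, e.embedding) >= threshold]

# Hypothetical usage with toy three-dimensional embeddings.
watch_list = [WatchListEntry("visitor-042", [0.10, 0.90, 0.30], "prior incident")]
frame = [0.12, 0.88, 0.31]  # embedding produced upstream by the camera's model
for match in check_frame(frame, watch_list):
    print(f"ALERT: possible match for {match.person_id} ({match.reason})")
```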
As a best practice, Gottschalk recommends building operators put up clear and conspicuous notices that the facility is using AI-assisted cameras to capture data for security purposes.
“Everyone loves to feel safe,” he said. “You can notify people and position it in a way that’s positive.”
In the event of a lawsuit, the organization can point to those notices as part of its defense.
“I would err on the side of more notification, because if you do get sued, at least you’ll have that,” Gottschalk said. “‘We provided a clear, conspicuous notice at the point of entry.’ I don’t think you’re going to be nailed for not being detailed enough. [Most state] privacy laws are mainly disclosure laws. They don’t even require authorization; you’re just required to disclose. So, I would provide notice, even if it’s generic.”
Typical state data privacy laws impose data handling requirements separate from whether the data is considered biometric. These could affect how organizations store images and how the images can be accessed, whether by request or in the event of a breach.
“Almost all states now have breach remediation rules and in some cases there are penalties,” he said. “So, if there’s a breach, the cost to the building can grow quickly just by virtue of complying with state breach notification laws. [The data] just becomes an inherent risk over a really long tail.”
Depending on the state law, even without a breach, a person who believes an organization holds data on them can request access to that data and, in certain cases, have it removed. Under the California Consumer Privacy Act, for example, consumers have a right to know what personal information an organization collects about them and how it's used and shared. In some cases, a person can require the organization to delete it.
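As a sketch of what honoring such a request might look like on the back end (the in-memory store, subject IDs and helper names here are assumptions for illustration, not a reference to any specific system):

```python
# Hypothetical sketch of honoring an access or deletion request for stored
# camera data. A real system would use a database with audit logging and
# would also purge backups per its retention policy.
image_records: dict[str, list[dict]] = {
    "subject-017": [
        {"captured_at": "2024-03-02T14:11:09Z", "camera": "lobby-1"},
    ],
}

def handle_access_request(subject_id: str) -> list[dict]:
    """Right to know: return every stored record tied to the requester."""
    return image_records.get(subject_id, [])

def handle_deletion_request(subject_id: str) -> int:
    """Deletion request: remove the requester's records, report the count."""
    removed = image_records.pop(subject_id, [])
    return len(removed)

print(handle_access_request("subject-017"))    # the records held on this person
print(handle_deletion_request("subject-017"))  # 1
print(handle_access_request("subject-017"))    # [] after deletion
```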
Bottom line: even though the data AI-assisted security cameras collect isn't biometric, letting people know you're collecting data is a good practice from a risk management standpoint, especially if you're in a state with robust data privacy protections and breach remediation requirements.
“Notification goes a long way in data privacy laws,” Gottschalk said.