Taking a new approach to draw attention to how Amazon uses people’s data, a shareholder is suing Jeff Bezos, Andy Jassy and 17 other Amazon executives who he says knowingly allowed the company to violate state laws.
Amazon has previously come under fire for the way it uses biometric data, such as fingerprints and facial images. The company has been accused of collecting and using images of individuals without their consent, and of violating state laws that prohibit companies from profiting from individuals’ biometric data.
Usually, such lawsuits are brought against the company. This time, shareholder Stephen Nelson’s lawsuit targets top Amazon decision-makers on behalf of the company itself.
The group of defendants – which includes leaders like founder and executive chairman Bezos; CEO Jassy; chief financial officer Brian Olsavsky; and general counsel David Zapolsky, along with all 11 board members – knowingly allowed Amazon to make false statements about its use of biometric data, Nelson alleges in the lawsuit filed in US District Court in Seattle in April. Senior company officials, Nelson’s lawyers say, “made a conscious choice to overlook Amazon’s conduct.”
Amazon executives and board members “caused substantial financial and reputational harm to Amazon,” the lawsuit says.
Amazon did not respond to requests for comment on the lawsuit.
Amazon, like many technology companies, uses biometric data to provide customers with the features they expect. Its virtual assistant Alexa uses voice recognition to answer user questions about the weather. A new feature for the Echo Show 15 device announced in September allows Alexa to use visual cues to identify an individual as they enter camera view and offer a personalized to-do list, calendar and music selection.
Amazon Web Services, the company’s cloud-computing arm, stores certain biometric data of its customers and those customers’ employees, according to the lawsuit, such as fingerprints used to gain access to buildings, voiceprints used to identify callers, and face scans of players.
Amazon launched its facial recognition service, Rekognition, in 2016 for customers to embed “powerful visual search and discovery” into apps, according to a blog post on its website. Since then, it’s been used in Amazon’s smart home systems, Alexa and other cameras, and Amazon has marketed the technology to law enforcement and US Immigration and Customs Enforcement.
Rekognition has sounded the alarm for activists and shareholders, who say facial surveillance technology could “unfairly and disproportionately” target people of color. Amazon shareholders will vote on a proposal at the company’s annual meeting on May 25 to demand a report on how Rekognition is used and marketed, and how it might violate privacy and civil rights.
In the summer of 2020, Amazon imposed a one-year moratorium on the sale and use of Rekognition by law enforcement.
In a statement urging shareholders to vote against the proposal, Amazon’s board said the company is “committed to the responsible use of our artificial intelligence and machine learning products and services.”
In response to growing concerns about how and where biometric data is used, Illinois passed legislation in 2008 to establish guidelines for how businesses and other entities could use an individual’s information. These guidelines require companies to obtain consent before collecting data and to tell individuals in writing what information is collected, why, and for how long it will be retained.
Washington passed a similar law in 2017, and about 20 states now have safeguards in place.
Since the laws came into effect, tech companies such as Amazon, Microsoft, Google, TikTok and Meta have faced litigation for allegedly misusing individuals’ biometric data. This month, facial recognition startup Clearview AI agreed to restrict the use of its massive collection of face images following a two-year lawsuit alleging it had collected photos of people without their consent.
Amazon is facing at least 14 class action lawsuits and 75,000 individual cases, according to Nelson’s court filing.
With legal costs and potential fines to come, Nelson argues Amazon’s executives and board are responsible for the consequences of these lawsuits. The potential damages are “astronomical to the point that the company could be put out of business if the violations are not immediately addressed, stopped and corrected,” the lawsuit states.
Nelson accuses Amazon policymakers of misleading investors about the financial and legal risks associated with its use of biometric data and potential violations of Illinois’ biometric information privacy law. Facing its own class action lawsuit, Facebook agreed in February 2021 to pay $650 million over its use of facial recognition technology.
The lawsuit alleges that, in legal documents, Amazon’s board dismissed shareholders’ concerns about these risks, and that defendants named in the suit signed “false statements” about Amazon’s compliance with state laws.
The defendants’ conduct “jeopardizes and harms one of Amazon’s most important (and fragile) assets: consumer confidence,” the lawsuit says. “Reputational damage is particularly devastating for tech companies like Amazon.”
Nelson declined to comment through his attorney, Gregory Wesner of Bainbridge Island-based Herman Jones.
The lawsuit asks the court to order Amazon to change its biometric data practices, as well as how it governs itself internally.
It suggests appointing board members with training in cybersecurity and consumer privacy, reviewing the company’s policies on “confidential reporting” and refining its process for investigating complaints.