Amazon argues it’s not liable for product that severely injured toddler
Amazon argued yesterday before the Texas Supreme Court that it should not be held liable for defective products sold through its site.
Over the years, several customers have been injured by defective products they purchased on the site from third-party sellers, including one woman who was blinded in one eye by a defective dog leash and another who was burned by a laptop battery. The case currently before the Texas Supreme Court involves a 19-month-old toddler who suffered permanent damage to her esophagus when she ingested a lithium-ion battery that popped out of a knockoff remote control.
For years, Amazon has claimed that it is not liable in such cases since it functions as a middleman for sales made through its Marketplace platform.
The toddler’s mother purchased the knockoff Apple TV remote from a third-party seller called “USA Shopping.” After the battery burned her daughter, she sought to find out who was behind the Amazon storefront, discovering that it was run by one “Hu Xi Jie” out of Shenzhen, China. Neither the mother nor Amazon has been able to locate or contact Hu Xi Jie.
Lawyers for the child’s mother have argued that Amazon is liable for the defective remote since the site serves the same function as a physical retail store, which is to put products into the stream of commerce. Traditional brick-and-mortar stores are typically held liable for injuries caused by defective products if they don’t take adequate steps to keep them out of the hands of consumers, but courts have ruled that online marketplaces aren’t subject to the same rules since they don’t exercise the same level of control.
That may be beginning to change. Amazon’s “middleman” defense in product liability cases has worked in the past, but it’s looking increasingly thin. Last year, for example, a California appellate court ruled that the company could be held liable in such cases, and the state’s supreme court declined to review the decision, effectively letting it stand.
During the hearing in Texas yesterday, Justice Debra Lehrmann questioned the company’s claim that it is merely a “facilitator” of sales. “[Amazon] can basically be selling junk and have no responsibility to figure it out,” she said.
Amazon’s lawyer argued that vetting products before they are listed on the site would be impossible given the scale of the problem. “Scale” is an increasingly common defense among Big Tech companies ranging from Amazon to Facebook, which have come under fire for their slipshod moderation practices. In Facebook’s case, the company has turned to artificial intelligence to tame the torrent of content that appears on the platform—with mixed success. Recently, the site’s own algorithms were found to be autogenerating pages for white supremacist groups that Facebook has banned from the platform.
Amazon’s scale problem is arguably harder to manage. A large portion of Amazon’s business is selling physical goods, which are practically impossible to vet using software alone. Thus, its strategy has been to react to problems as they arise rather than head them off in advance.
The plaintiff’s lawyer argued that Amazon’s reactive approach isn’t sufficient. For one, sellers kicked off the platform can simply open a new storefront under a different name, creating a game of Whac-A-Mole. There’s also the question of how effective Amazon’s product safety teams really are, the lawyer told the court. “I deposed the product safety team and this woman, with all due respect, she was just asleep at the wheel,” he said.