Justice Department sends its Section 230 rewrite to Congress


The Department of Justice today dropped a proposed “recalibration” of one of the most important laws governing the US Internet into Congress’ lap and urged legislators to act to remove a liability protection on which nearly every website and app currently relies.

Attorney General Bill Barr sent the proposed legislation—an extension of his June wish list—to Speaker of the House Nancy Pelosi and Vice President Mike Pence (in his role as president of the Senate) this morning.

“For too long Section 230 has provided a shield for online platforms to operate with impunity,” Barr said in a written statement. “Ensuring that the Internet is a safe, but also vibrant, open, and competitive environment is vitally important to America,” he added. “We therefore urge Congress to make these necessary reforms to Section 230 and begin to hold online platforms accountable both when they unlawfully censor speech and when they knowingly facilitate criminal activity online.”

What is Section 230?

“Section 230” is the nickname for a very short part of the law under which Internet companies have been regulated since 1996. The key part of § 230 of the Communications Decency Act currently reads:

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

Those 26 words are included under the subheading, “Protection for ‘Good Samaritan’ blocking and screening of offensive material,” which allows Internet platforms to moderate user content shared on those platforms more or less however they wish—heavily, moderately, or not at all. Broadly speaking, it means that if you use an Internet service to say something obscene or unlawful, then you, not the Internet service, are the one responsible for having said the thing, and the Internet service is immune from liability for whatever you said. (That’s the short version. For the long version, give Ars’ own Tim Lee’s explanation of the law and its history a read.)

What does the DOJ propose changing?

While the proposal (PDF) leaves the core principle of Section 230 unaltered, it immediately applies a big honking qualifier.

That qualifier says that the existing law won’t apply “to any decision, agreement, or action by a provider or user” of an Internet service “to restrict access to or availability of material provided by another information content provider.” That seems to mean that a service such as Twitter or Facebook would no longer be shielded from lawsuits for blocking users from sharing URLs to, for example, hate sites.

The proposed legislation also adds several explicit carve-outs to the law. The first is a “bad Samaritan” carve-out, exempting a platform from qualifying for Section 230 immunity if it “acted purposefully with the conscious object to promote, solicit, or facilitate material or activity… that the service provider knew or had reason to believe would violate Federal criminal law.”

“Criminal material” also has an explicit carve-out that would require platforms to remove or restrict access to that material and report it to law enforcement as soon as they received notice of it—a mechanism seemingly inspired by the way copyright takedown requests work today.

The last carve-out applies to “judicial decisions.” If a court rules that any content posted to a service is “defamatory under state law or unlawful in any respect,” platforms would have to remove that content as well.

The last page of the proposed legislation adds a list of “good faith” requirements. In order to be considered to be acting in good faith, a platform must have publicly available terms of service that include its moderation rules; must moderate content “consistent with” those terms of use; must not block content on “deceptive or pretextual grounds”; and must supply the entity being moderated with a “reasonable factual basis for the restriction of access and a meaningful opportunity to respond.”

“A platform that chooses not to host certain types of content would not be required to do so,” is how Barr described it to Pelosi and Pence, “but it must act in good faith and abide by its own terms of service and public representations. Platforms that fail to do those things should not enjoy the benefits of Section 230 immunity.”

Companies are already required to adhere to the terms of service they provide to customers, for what it’s worth. The FTC is responsible for taking enforcement action against companies it finds to have used misleading or deceptive terms of service—as it did, in 2011, against both Facebook and Twitter.

Why now?

Barr’s line about platforms “unlawfully censoring” speech is the true crux of the issue: the Trump administration has for several years claimed that social media platforms such as Facebook and Twitter have an anti-conservative bias and unlawfully stifle right-wing content. The administration has been seeking ways to counteract the alleged “bias” by changing the laws that apply to the social media sector, including Section 230.

These assertions of bias are not supported by any evidence to date. A cursory look at the evidence, in fact, tends to support the opposite conclusion. President Donald Trump himself has more than 86 million followers on Twitter—not the most-followed account, but comfortably within the top 10. Over on Facebook, the top 10 highest-traffic posts on any given day are almost all from conservative sources. Facebook itself tried to identify anti-conservative bias on its platform but did not find any. Stories from several organizations, including a recent Bloomberg cover story, find exactly the opposite, reporting that Facebook has instead worked to accommodate the Trump campaign and conservative outlets.

Nonetheless, Trump signed an executive order in May targeting online platforms. That order has a three-pronged approach, asking the Federal Communications Commission, Federal Trade Commission, and Department of Justice each to undertake some part of the enforcement.

Is the proposed law… legal?

If Congress were to pass the law, it would almost certainly face near-immediate legal challenge.

The proposal is not far off the FCC’s “must-carry” rules for broadcasters, said Gautam Hans, a professor of law at Vanderbilt University and director of the Stanton Foundation First Amendment Clinic.

“My biggest concern is that it seems to be proscribing the platform’s ability to take down content,” Hans added. “In effect, this is attempting to create ‘must carry’ obligations for certain kinds of speech.” Hans added that the proposal amounts to “hypocrisy” on the part of conservatives, as compared to the Fairness Doctrine getting axed during the Reagan administration.

“There is much to criticize about Section 230, how platforms have responded to its framework, and the way it has played out in the courts—but this is hardly the path forward we should take,” Hans concluded. “I suspect that the platforms would have some good First Amendment arguments if this legislation went into effect.”

Consumer-advocacy group Free Press also had a withering take on the proposal. “The draft legislation from the Department of Justice is a naked attempt to silence anyone who attempts to correct or criticize Trump, his allies, and fellow travelers,” Free Press senior policy counsel Gaurav Laroia said. “It would incentivize platforms to host content created by racists, sexists, propagandists, and trolls. This legislation would also open the door to those who want to drown the Internet in an ocean of disinformation and toxicity.”

Aside from the content, the proposed recalibration also may not be a particularly well-crafted piece of legislation. Blake Reid, a professor of technology and telecom law at the University of Colorado at Boulder, observed in a series of tweets that the proposal, as structured, doesn’t particularly hold together as a legal argument. “I have no idea how courts would interpret this because it just doesn’t make any sense,” Reid wrote. “This is not a long statute. But after trying to parse this proposal for half an hour I’m left with major questions about what it’s even trying to accomplish.”

Still just a wish list

Now, the House and Senate can take the draft legislation under consideration and probably do exactly nothing with it, just as Congress has done with the last several tech-related bills. This particular proposed bill has the weight of the White House behind it, but the chances of it becoming law at this time are slim at best.

The administration’s proposed text is, in fact, similar to a bill that Sens. Roger Wicker (R-Miss.), Lindsey Graham (R-S.C.), and Marsha Blackburn (R-Tenn.) introduced earlier this month in the Senate. Their version (PDF), called the Online Freedom and Viewpoint Diversity Act, is shorter but more directly targets Internet companies’ ability to moderate content on their platforms.

“The contentious nature of current conversations provides perverse incentive for these companies to manipulate the online experience in favor of the loudest voices in the room,” Blackburn said about that bill at the time. “There exists no meaningful alternative to these powerful platforms, which means there will be no accountability for the devastating effects of this ingrained ideological bias until Congress steps in and brings liability protections into the modern era.”

That bill is now sitting with the Senate Commerce Committee, where it is likely to languish until the current session of Congress expires in January. That, too, is almost certainly the fate of Barr’s proposal. As little appetite as both the House and Senate seemed to have for moving on tech legislation in 2019, the extraordinary maelstrom of circumstances created in 2020 by the confluence of the COVID-19 pandemic, the impending presidential election, and the recent passing of Supreme Court Justice Ruth Bader Ginsburg makes lawmakers even less likely to take up any such bill in the immediate term.

That said, however: Section 230 is now almost 25 years old, and the Internet has changed significantly since its adoption. The DOJ’s proposal is not the first attempt to amend it, and it will not be the last.

https://arstechnica.com/?p=1709042




Senate takes another stab at privacy law with proposed COPRA bill

Sen. Maria Cantwell during a Senate Finance Committee hearing on Aug. 22, 2018.

Perhaps the third time’s the charm: a group of Senate Democrats, following in the recent footsteps of their colleagues in both chambers, has introduced a bill that would impose sweeping reforms on the current disastrous patchwork of US privacy law.

The bill (PDF), dubbed the Consumer Online Privacy Rights Act (COPRA), seeks to provide US consumers with a blanket set of privacy rights. The scope and goal of COPRA are in the same vein as Europe’s General Data Protection Regulation (GDPR), which went into effect in May 2018.

Privacy rights “should be like your Miranda rights—clear as a bell as to what they are and what constitutes a violation,” Sen. Maria Cantwell (D-Wash.), who introduced the bill, said in a statement. Senators Amy Klobuchar (D-Minn.), Ed Markey (D-Mass.), and Brian Schatz (D-Hawaii) also co-sponsored the bill.

The press release announcing the bill also includes statements of support from several consumer and privacy advocacy groups, such as Consumer Reports, the Electronic Privacy Information Center (EPIC), the Georgetown Law Center on Privacy & Technology, and the NAACP.

What’s in the bill?

The proposals within COPRA fall basically into three main buckets: enumerated rights for consumers, data-handling requirements for businesses, and enforcement mechanisms.

As explained in a one-page summary of the bill (PDF), the rights consumers would gain from COPRA include:

  • The right to be free from deceptive and harmful data practices; from financial, physical, and reputational injury; and from acts that a reasonable person would find intrusive, among others
  • The right to access their data and greater transparency, which means consumers have detailed and clear information on how their data is used and shared
  • The right to control the movement of their data, which gives consumers the ability to prevent data from being distributed to unknown third parties
  • The right to delete or correct their data
  • The right to take their data to a competing product or service

On the company side, businesses would be required to demonstrate that they take “preventive and corrective actions” to protect consumer data from leaks, breaches, hacks, or other kinds of misappropriation. Highly sensitive data, such as biometric data and geolocation data, would also be subject to stronger standards for protection and use.

The bill would put responsibility for enforcement in the hands of the Federal Trade Commission, which would also be tasked with creating specific new rules detailing the processes covered entities would be required to follow.

COPRA also seems to take into account the challenges the EU and consumers have faced since the GDPR went into effect: it specifically tasks the FTC with making sure those rules not only require “clear and conspicuous” notices to opt in or opt out of data collection and transfers but also work “to minimize the number of opt-out designations of a similar type that a consumer must make” (such as an “accept cookies” warning on every single website one visits).

https://arstechnica.com/?p=1624123




FTC head asks Congress for real privacy laws he can enforce

FTC Chairman Joseph Simons testifying about antitrust matters before a Senate committee in September 2019.

In testimony before a House subcommittee Wednesday, Federal Trade Commission Chairman Joseph Simons renewed his call for Congress to pass new privacy legislation, telling representatives, essentially, he can’t enforce a law that doesn’t exist.

Simons was on Capitol Hill testifying in a hearing on “Online Platforms and Market Power,” the probe the House Antitrust subcommittee launched in June to dig into Apple, Amazon, Facebook, and Google.

At the highest level, the FTC is responsible for basically two things: protecting competition and protecting consumers. To that end, it’s one of the two bodies with antitrust oversight, sharing responsibility with the Justice Department for reviewing mergers and challenging anticompetitive behavior.

As part of the consumer protection half of its mandate, the FTC also regulates the gulf between what companies say they will do and what they actually do. As such, the agency has become the closest thing the United States has to a privacy regulator. But its actions and authority are limited under the law, and when it comes to privacy, the commission can more or less only intervene when a prior agreement has been broken—it can’t just impose fines or other penalties for behavior that feels like it should be illegal but isn’t.

During the hearing, Rep. Joe Neguse (D-Colo.) pressed Simons on the FTC’s track record with privacy enforcement. “I want to talk briefly about the settlement with Facebook earlier this year,” Neguse said, asking Simons to explain the “methodology” the commission used to reach the $5 billion agreement.

“I’m very disappointed that you all are disappointed” in the settlement, Simons replied, defending the settlement as “very aggressive.” But, he added: “Even if we wanted to do more, we don’t have the authority to do more. We do not have the authority to impose fines or, on our own, increase the injunctive relief. We have to go to court. So what we did is we negotiated very long and hard with Facebook to get the best relief we could in a settlement and then compare that to what we would have gotten if we had gone to court.”

Neguse cut Simons off there but added, “Ultimately, one point that I think we both agree on is that the tools that the FTC has under existing statute, in my view, could be strengthened, given the trend and the challenges that your agency faces.” Simons agreed.

Later in the hearing, Simons reiterated the sentiment. “I think if you want us to do more on the privacy front, then we need help from you,” he told the chamber. “We’ve done as much as we can do with the tools we have.”

A frequent refrain

Simons has sounded the alarm about this particular enforcement gap several times already this year. The FTC in July announced not only the Facebook settlement but also a $575 million settlement with Equifax over its massive consumer data breach. With each announcement, Simons called on Congress to act.

“The CFPB and the states were able to obtain civil penalties for this massive breach by a major financial institution. The FTC could not,” Simons said in a press conference about the Equifax case. “Fortunately other agencies were able to fill in the gap this time. That will not always be the case, which sends the wrong signal regarding deterrence.”

During a separate press event about Facebook a week later, he echoed the sentiment, saying “We are a law enforcement agency without the authority to promulgate general privacy regulations. Our authority in this case comes from a 100-year-old statute that was never intended to deal with privacy issues like the ones that we address today.”

Members of Congress likewise have been calling on their colleagues to do something about federal privacy laws. Sen. Ron Wyden (D-Ore.) last month proposed a bill that would not only drastically expand the FTC’s scope and resources but also would impose criminal penalties for company officers found responsible for mishandling consumer data.

The House has likewise taken up the call: earlier this month, Reps. Anna Eshoo and Zoe Lofgren, both Democrats from California, introduced the Online Privacy Act. Their bill would not only create new standards but also take enforcement out of the hands of the FTC altogether, instead creating a Digital Privacy Agency to handle it.

In the meantime, regulators have been doing what they can, weighing whether matters of consumer data and privacy fall under antitrust law. DOJ antitrust chief Makan Delrahim has several times discussed ways in which poor privacy controls that harm consumers may be considered a result of poor competition.

“As a foundational matter, we must acknowledge that data has economic value, and some observers have said it is analogous to a new currency,” Delrahim said in a speech last week:

Like other features that make a service appealing to a particular consumer, privacy is an important dimension of quality. For example, robust competition can spur companies to offer more or better privacy protections. Without competition, a dominant firm can more easily reduce quality—such as by decreasing privacy protections—without losing a significant number of users.

As I have said before, these non-price dimensions of competition deserve our attention and renewed focus in the digital marketplace.

https://arstechnica.com/?p=1602063




New bill would create Digital Privacy Agency to enforce privacy rights

Chairwoman Anna Eshoo (D-Calif.) conducts a House Energy and Commerce Subcommittee on Health markup on Thursday, July 11, 2019.

Congress is taking yet another stab at addressing the near-complete lack of federal laws covering the absolutely massive trove of data that companies now collect on every one of us, which forms the backbone of basically the entire big tech era.

Representatives Anna Eshoo and Zoe Lofgren, both Democrats from California, introduced the Online Privacy Act today. The act would create a new federal agency, the Digital Privacy Agency, to enforce privacy rights. The act would also authorize the agency to hire up to 1,600 employees.

“Every American is vulnerable to privacy violations with few tools to defend themselves. Too often, our private information online is stolen, abused, used for profit, or grossly mishandled,” Eshoo said in a statement. “Our legislation ensures that every American has control over their own data, companies are held accountable, and the government provides tough but fair oversight.”

“Our country urgently needs a legal framework to protect consumers from the ever-growing data-collection and data-sharing industries that make billions annually off Americans’ personal information,” Rep. Lofgren added. “Privacy for online consumers has been nonexistent—and we need to give users control of their personal data by making legitimate changes to business practices.”

The Online Privacy Act

The provisions in the bill (PDF) would apply to “any entity (including nonprofits and common carriers) that intentionally collects, processes, or maintains personal information AND transmits personal information over an electronic network.”

Under the terms of the OPA, individuals would have the right to obtain, correct, and delete data collected about them by covered entities, as well as to request “a human review” of automated decisions. Users would also have to opt in to having their personal data used for training machine learning algorithms, and they would be able to choose how long companies retain their data.

The bill distinguishes between aggregated data and personal, identifiable data that is tied to an individual, and it places strong limitations on use of the latter. As outlined in a one-page fact sheet, the OPA would:

  • articulate the need for and minimize the user data [covered entities] collect, process, disclose, and maintain
  • minimize employee and contractor access to user data
  • not disclose or sell personal information without explicit consent
  • not use third-party data to reidentify individuals
  • not use private communications (e.g., emails and Web traffic) for ads or other invasive purposes
  • not process data in a way that violates civil rights, e.g., employment discrimination
  • only process genetic information in limited circumstances
  • use objectively understandable privacy policies and consent processes, and may not use ‘dark patterns’ to obtain consent
  • employ reasonable cybersecurity policies to protect user data, and
  • notify the agency and users of breaches and data-sharing abuses, e.g., Cambridge Analytica

The privacy mess

Privacy law in the United States today is a patchwork of regulation, and the end result is basically a hot mess that leaves agencies with limited authority to investigate and penalize even egregious abuses of personal data.

The federal statutes that exist each cover a specific, limited kind of data and enumerate a specific, limited kind of entity that’s obligated to protect that data. So for example, while your doctor’s office can’t sell information about your diagnoses to a third party, no such limitation applies to apps or wearable devices that collect the same kinds of data.

A handful of states have additional laws on the books. Illinois, for example, adopted a prescient law back in 2008 that regulates the collection and use of individuals’ biometric data. Facebook since 2015 has been embroiled in a class-action lawsuit in that state over its use of facial recognition.

The biggest player at the state level is California, which in 2018 adopted a sweeping privacy law that would give individuals more control over how their personal data is collected, used, and sold. That law has survived several attempts by opponents to weaken its key provisions, and it goes into effect on January 1.

Representatives Eshoo and Lofgren are far from the first to propose new federal legislation to address the morass. In fact, they’re not even the first this year. Sen. Ron Wyden (D-Ore.) last month introduced the Mind Your Own Business Act, which not only seeks to introduce new standards for user privacy and how data is handled, but would also impose criminal penalties, including jail time, on the leadership of companies that fail to comply.

Sen. Marco Rubio (R-Fla.) also introduced a privacy-related bill earlier this year. His American Data Dissemination Act would create a process and timeline for the Federal Trade Commission to establish privacy rules, rather than actually establishing new rules. It would also prohibit any state from enforcing its own law related to the same kinds of data as the federal law, something many big tech companies strongly support.

https://arstechnica.com/?p=1597031