The Supreme Court correctly found that social media platforms, like newspapers, bookstores, and art galleries before them, have First Amendment rights to curate and edit the speech of others they deliver to their users, and the government has a very limited role in dictating what social media platforms must and must not publish. Although users remain understandably frustrated with how the large platforms moderate user speech, the best deal for users is when platforms make these decisions instead of the government.
As we explained in our amicus brief, users are far better off when publishers make editorial decisions free from government mandates. Although the court did not reach a final determination about the Texas and Florida laws, it confirmed that their core provisions are inconsistent with the First Amendment when they force social media sites to publish user posts that are, at best, irrelevant, and, at worst, false, abusive, or harassing. The government’s favored speakers would be granted special access to the platforms, and the government’s disfavored speakers silenced.
We filed our first brief advocating this position in 2018 and are pleased to see that the Supreme Court has finally agreed.
Notably, the Court emphasizes another point EFF has consistently made: that the First Amendment right to edit and curate user content does not immunize social media platforms and tech companies more broadly from other forms of regulation not related to editorial policy. As the Court wrote: “Many possible interests relating to social media can meet that test; nothing said here puts regulation of NetChoice’s members off-limits as to a whole array of subjects.” The Court specifically calls out competition law as one avenue to address problems related to market dominance and lack of user choice. Although not mentioned in the Court’s opinion, consumer privacy laws are another available regulatory tool.
We will continue to urge platforms large and small to adopt the Santa Clara Principles as a human rights framework for content moderation. Further, we will continue to advocate for strong
[…]