Musk buying TikTok would undermine competitive tension in an already untrustworthy social-media market + a Meta note (1)

Opinion by Mathew Carr

Jan 16, 2025 — It would be yet another step too far.

Elon Musk, owner of X, should not be allowed to buy TikTok’s US operations, because such a deal would reduce competition in the social-media market. These two companies are already too big.

https://news.sky.com/story/tiktok-reacts-to-report-of-musk-ownership-to-prevent-us-ban-13288673

Musk is also too conflicted, given his various companies and his role in President Donald Trump’s new administration.

It’s a stark example of what Joe Biden is warning about as he leaves office:

Times of London:

Biden speech: farewell address warns of ‘oligarchy’ in America
https://www.thetimes.com/article/8fea14fa-cb07-4902-96d3-f55b4722b377?shareToken=2d4ec605288756715a7af676a9007f96

Or broker?

FT on Wednesday:

Musk’s X and Meta will be key beneficiaries anyway if TikTok dies… as TikTokers find new amusement.

The TikTok ban should be overturned, if possible.

————-

Note

Might Meta be trying to reduce the risk of disinformation lawsuits as it fires fact-checkers and adopts X’s Community Notes system? It could then say Meta didn’t fail; it was the “community notes” that failed. That is, it could blame its own customers/users for the disinformation.

ChatGPT:

Yes, Meta’s shift from professional fact-checkers to a community-driven system like “Community Notes” could indeed be a strategic move to reduce its exposure to disinformation-related lawsuits. Here’s how this approach might serve to shield Meta:

1. Shifting Responsibility to Users

• By relying on user-generated content for fact-checking, Meta can argue that it is not directly responsible for inaccuracies or failures in addressing disinformation. Instead, it can claim that the system is a reflection of collective input from its users, distancing itself from accountability.

• If misinformation spreads despite the “Community Notes” system, Meta can point to users’ contributions as the cause, rather than its own policies or negligence. (A rough sketch of how such cross-community scoring can work follows this point.)
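To make the mechanism concrete: X’s published Community Notes scorer uses matrix factorization to infer “viewpoint clusters” from rating history and only surfaces notes that bridge those clusters. The toy Python sketch below captures just that bridging idea; the cluster labels, thresholds, and note names are invented for illustration and are not Meta’s or X’s actual code.

# Toy sketch of "bridging-based" note scoring, the idea behind
# community-driven fact-checking. The real open-source Community
# Notes scorer infers viewpoint clusters via matrix factorization;
# here the clusters, thresholds, and data are assumed for illustration.

from collections import defaultdict

# Each rating: (note_id, rater_cluster, helpful). The cluster label is
# a stand-in for the viewpoint groups a real scorer would infer from
# rating history rather than take as given.
ratings = [
    ("note_a", "cluster_1", True),
    ("note_a", "cluster_1", True),
    ("note_a", "cluster_2", True),
    ("note_a", "cluster_2", True),
    ("note_a", "cluster_2", False),
    ("note_b", "cluster_1", True),
    ("note_b", "cluster_1", True),
    ("note_b", "cluster_2", False),
    ("note_b", "cluster_2", False),
]

HELPFUL_THRESHOLD = 0.5  # per-cluster helpful ratio required (assumed)
MIN_RATINGS = 2          # per-cluster minimum sample size (assumed)

def surfaced_notes(ratings):
    """Surface a note only when every rater cluster independently finds
    it helpful -- the 'bridging' requirement that stops any one faction
    from controlling which labels appear."""
    # note -> cluster -> [helpful_count, total_count]
    tallies = defaultdict(lambda: defaultdict(lambda: [0, 0]))
    for note, cluster, helpful in ratings:
        tallies[note][cluster][0] += int(helpful)
        tallies[note][cluster][1] += 1

    surfaced = []
    for note, clusters in tallies.items():
        ok = all(
            total >= MIN_RATINGS and helpful / total > HELPFUL_THRESHOLD
            for helpful, total in clusters.values()
        )
        if ok and len(clusters) >= 2:  # require cross-cluster agreement
            surfaced.append(note)
    return surfaced

# note_a is rated helpful by both clusters and gets surfaced;
# note_b is liked by only one cluster, so it is not.
print(surfaced_notes(ratings))  # ['note_a']

The design consequence this illustrates: no single faction’s ratings can force a label onto content, which is precisely what lets a platform describe itself as aggregating community input rather than making editorial calls.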

2. Avoiding Defamation or Liability Claims

• Employing professional fact-checkers might make Meta more directly liable for flagged content. A party could argue that Meta, through its fact-checking program, made a deliberate decision to label content as false or misleading.

• With “Community Notes,” Meta is not explicitly endorsing any single claim but instead curating community input, reducing the perception that it is making editorial decisions—a key trigger for legal liability.

3. Legal Protections Under Section 230

• In the U.S., Section 230 of the Communications Decency Act generally protects platforms like Meta from being held liable for user-generated content. By emphasizing community-driven systems, Meta leans more heavily on these protections, framing itself as a neutral platform rather than a publisher or arbiter of truth.

4. Deflecting Public Criticism

• Meta has faced significant scrutiny for failing to combat disinformation effectively. A decentralized, user-driven system allows Meta to claim that it is empowering its users while reducing its direct responsibility to fact-check every piece of content.

• If disinformation persists, Meta can argue that it provided tools for users to engage and correct the record, thereby deflecting criticism.

5. Cost and Risk Reduction

• Replacing professional fact-checkers with a community-driven system lowers operational costs while also avoiding the potential backlash or lawsuits arising from fact-checkers’ decisions (e.g., accusations of bias, censorship, or errors).

• Outsourcing this responsibility to users minimizes Meta’s exposure to lawsuits that could allege intentional suppression of viewpoints.

Caveats to This Strategy

• While the move might reduce short-term legal risks, it could backfire if Community Notes fails to adequately address misinformation, potentially leading to lawsuits or regulatory scrutiny for negligence.

• Governments or critics might argue that this approach reflects a lack of accountability, especially if the system amplifies harmful content.

Meta’s strategy appears to be a calculated effort to manage legal risks, reduce costs, and maintain plausible deniability in the face of ongoing challenges around misinformation and disinformation. However, whether this will satisfy public and regulatory demands for accountability remains uncertain.
