3 Comments

The challenge with social media platforms is that their algorithms promote the most morally outraged content that pits two groups against each other, because that content keeps people engaged.

The platforms need to be held accountable for promoting the toxic content, not for having it on the site.

I think combining cyberbullying and CSAM in the same hearing creates confusion, which is why nothing gets done.

There are regulations for CSAM and lots of tech to weed it out.

There are no regulations for designing algorithms that promote division. And the latter is the bigger issue.


The politicians were doing what they do best: grandstanding. Expecting anything else is…err…fanciful.


Charles, disclaimer: I'm not a lawyer, but I've studied Section 230. The issues are a complicated mess with absolutely enormous money at stake. But I'd say this isn't a good gloss: "But you can't be sued for what people post, unless (and not necessarily even then) you're put on notice that it's in some way offensive to someone." The key point isn't about "offensive", which sounds too mild. It's more about "allegedly violates some law", and what happens then.

Note, literal CSAM content is actually one of the black-letter legal exceptions to Section 230:

https://www.law.cornell.edu/uscode/text/47/230

"Nothing in this section shall be construed to impair the enforcement of ... 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute."

Though I agree with you that these hearings are a circus. I think it's that politicians do get politics; that's their job. Wrestling with real Section 230 reform is hard, and will get any legislator who tries it slammed with all the resources of some of the biggest corporations in the world (again, because of the money involved). Holding a media event for everyone to do meaningless posturing is very easy.
