@louis I think we actually have to think about that in an application-specific way. If a library makes a book available, is the library liable? Potentially yes, but we put pretty wide bumpers around that, because we judge the harms of censoring books to be a greater hazard than making them available, and so choose a direction in which to err. 1/

in reply to @louis

@louis I suspect that for search engines we’d make a quite similar choice, but we would apply more scrutiny to, say, TikTok. Customized content on a large platform is “complicated” relative to simple mass broadcast, but the scale of potential harms can be similar, and I’m not sure why we’d want complications to become exonerations. Are the benefits of these institutions so great that we want to bear more harms? That’s a value judgment we get to make collectively. 2/

in reply to self

@louis That said, we did deal with these issues with television and film, and the bar to liability was pretty high. Getting rid of the blanket shield in Section 230 doesn’t mean any little thing will get you sued. Here’s a law review article lamenting how hard it was to prosecute very similar cases during the 1980s. There was no Section 230, and courts still tended to err on the side of not chilling speech. 3/ digital.sandiego.edu/cgi/viewc

in reply to self

@louis (Are you old enough to remember this event? I don’t know if there was ultimately any liability. archive.is/xxoYY) /fin

in reply to self