@louis I think we actually have to think about that in an application-specific way. If a library makes a book available, is the library liable? Potentially yes, but we put pretty wide bumpers around that, because we judge censoring books to be a greater hazard than making them available, and so choose which direction to err in. 1/
@louis I suspect that for search engines we’d make a quite similar choice, but we would apply more scrutiny to, say, TikTok. Customized content on a large platform is “complicated” relative to simple mass broadcast, but the scale of potential harms can be similar, and I’m not sure why we’d want complications to become exonerations. Are the benefits of these institutions so great that we want to bear more harm? That’s a value judgment we get to collectively make. 2/
@louis That said, we did deal with these issues with television and film, and the bar to liability was pretty high. Getting rid of the blanket shield in Section 230 doesn’t mean any little thing will get you sued. Here’s a law review article lamenting how hard it was to prosecute very similar cases during the 1980s. There was no Section 230. Courts still tended to err on the side of not chilling speech. 3/ https://digital.sandiego.edu/cgi/viewcontent.cgi?article=1659&context=sdlr
@louis (Are you old enough to remember this event? I don’t know if there was ultimately any liability. https://archive.is/xxoYY) /fin