@lain In the end, probably the best outcome of applying this principle is that large estates that cannot be easily patrolled and kept free of unwanted intruders with catapults become more onerous to own than they're otherwise worth, due to the risks they expose the owner to, which comes back to favoring decentralization.
@lain Within the metaphor of free association, if someone goes into your backyard, sets up a catapult, and starts lobbing rocks at the neighbors, you can expect to get some grief from the neighbors. If you have a reasonable explanation, such as that you live alone, it's a big backyard, and you were on vacation and could not possibly have noticed and stopped the catapult before it started throwing rocks, then I expect they would understand. But if you have a large staff and security cameras everywhere, and in the same time period you've expelled dozens of people who weren't setting up catapults in your yard, then I think it's not unreasonable to start thinking maybe you should be held responsible for allowing the catapult guy to set up on your land.
@lain I agree in the sense that free association is the superior principle here, but with one caveat: I simply do not think the current hypocrisy should be tolerated, where platforms claim "we can't be expected to be responsible for user content, it's impossible without drastic measures!" and then proceed to heavily curate user content anyway. As long as the state and legal system are here, and they probably will be for the foreseeable future, I expect and demand consistency.
You don't really have to; you could easily modulate the expectation of moderation through good-faith clauses scaled to the ability of the party in question. The ability of a small blog or small fedi server to moderate content that opens them to liability is not the same as that of a big corporation. While infringing content could reasonably be left up for a week on a small blog because the only moderator is on vacation and couldn't be expected to remove it, a large corporation with staff watching 24/7 wouldn't be given the same benefit of the doubt if they ignored for a week something that had been flagged for their moderators to review (unless, I suppose, they can give a really good reason why no one had a chance to take it down).
Another type of moderation I could see granting to companies that choose to be conduits for speech rather than curators is the moderation of off-topic speech, provided the scope of what is on-topic is clearly delineated and the moderation is applied not fickly but even-handedly. If a site wants to be an echo chamber, they have to own it, say it, and apply it fairly.
That used to be the case, but now they do have the tools and the manpower to keep watch essentially 24/7; they keep demonstrating it every day. Good-faith clauses could also be extended to them, providing a reasonable chance to moderate speech that opens them to liability.
I know this would suck, I don't WANT them to take this route, and I don't think they would, but to avoid the charge that this forces companies to act like a government, they have to be given that option.
I would much rather they decide to broadly allow content that is legal and doesn't break the site on a technical level, and avoid all of this. It's really not that hard for them to do: just don't hire activists to do your moderation, and fire moderators when you catch them making politically motivated decisions.