The #federation model is effectively collaborative #moderation. The behaviour of every individual is too hard to track, but the limits of what a particular server considers permissible are more easily discovered. A user can pick an instance based not only on that instance's own limits, but also on how it handles federation with instances whose limits differ significantly.
Importantly, this model includes transparency and accountability. By contrast, shared blocklists address the scalability problem, but do so entirely without accountability. One of the largest shared blocklists on Twitter systematically blocks trans people, especially trans women. That list began as a way to compensate for Twitter's lack of moderation, but, absent accountability, it became a tool that furthers inequality and invisibilises marginalised people. What in Scuttlebutt prevents this?