There are infinitely many people in existence, unable to communicate with one another. An angel makes it known to all that if, and only if, infinitely many of them make some minor sacrifice, he will give them all a great benefit far outweighing the sacrifice. (Maybe the minor sacrifice is the payment of a dollar and the great benefit is eternal bliss for all of them.) You are one of the people.
It seems you can reason: We are making our decisions independently. Either infinitely many people other than me make the sacrifice or not. If they do, then there is no gain for anyone in my making it: we get the benefit anyway, and I make the sacrifice unnecessarily. If they don't, then there is no gain for anyone in my making it either: we don't get the benefit even if I do, so why should I make the sacrifice?
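The dominance reasoning above can be sketched as a toy payoff calculation (my construction, not the original's; the particular numbers for the benefit and the cost are arbitrary assumptions, with the benefit far outweighing the cost). The key feature is that whether "infinitely many others sacrifice" holds is fixed independently of my single choice, since one payment cannot tip an infinite count:

```python
def my_payoff(i_pay: bool, infinitely_many_others_pay: bool,
              benefit: float = 100.0, cost: float = 1.0) -> float:
    """My payoff, given my choice and the (fixed) fact about the others.

    Because the others are infinite in number, my one payment cannot
    change whether infinitely many payments are made, so the benefit
    depends only on the others' behavior.
    """
    got_benefit = infinitely_many_others_pay
    return (benefit if got_benefit else 0.0) - (cost if i_pay else 0.0)

# In both possible states of the world, not paying beats paying for me:
for others_pay in (True, False):
    assert my_payoff(False, others_pay) > my_payoff(True, others_pay)
```

The assertions pass in both cases, which is just the dominance argument: holding the others fixed, declining the sacrifice is better for me either way, even though everyone declining is worse for all.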
If consequentialism is right, this reasoning seems exactly right. Yet one had better hope that not everyone reasons like this.
The case reminds me of both the Newcomb paradox (though without the need for prediction) and the Prisoner's Dilemma. As in the Prisoner's Dilemma, it sounds as if the problem is selfishness and free-riding. But perhaps, unlike in the Prisoner's Dilemma, the problem really isn't about selfishness.
For suppose that the infinitely many people each occupy a different room of Hilbert's Hotel (numbered 1, 2, 3, ...). Instead of being asked to make a sacrifice oneself, however, each person is asked to agree to the imposition of a small inconvenience on the person in the next room. It seems quite unselfish to reason: My decision doesn't affect anyone else's (so I suppose; the inconveniences are imposed only after all the decisions have been made). Either infinitely many people other than me will agree or not. If they do, then we get the benefit, and it is pointless to impose the inconvenience on my neighbor. If they don't, then we don't get the benefit, and it is pointless to add to this loss the inconvenience to my neighbor.
Perhaps, though, the right way to think is this: If I agree (in either the original or the modified case), then my action partly constitutes a good collective (though not joint) action. If I don't agree, then my action runs the risk of partly constituting a bad collective (though not joint) action. And I have good reason to be on the side of the angels. But the paradoxicality doesn't evaporate.
I suspect this case, or one very close to it, is in the literature.