My biggest concern with how fact-checking has played out is that fact-checkers have fallen into an "us vs them" perspective, with "us" being the fact-checkers themselves. That becomes a problem when they analyze claims about how fact-checkers got stories wrong, such as the year they spent shutting down all discussion of the lab leak hypothesis.
I've noticed this contradiction very strongly across misinformation studies. For example, Sander van der Linden, author of https://www.amazon.com/Foolproof-Misinformation-Infects-Mind..., regularly uses such claims about fact-checkers as an example of right-wing misinformation. But they aren't misinformation.
Fact-checking as a profession can only work if fact-checkers figure out how to get past their own internal biases. Unfortunately, they haven't. The fundamental reason is that when our emotions engage, it is easy to think we've made a logical decision when we've actually made an emotional one. None of us are immune to this. And the tools for dealing with it tend mostly to be discussed in professions where it is harder to fool others about your objective mistakes.
See https://paulgraham.com/identity.html and https://blog.codinghorror.com/the-ten-commandments-of-egoles... for examples of how good programmers address their own inherent biases. Programming is one of the professions that has to deal with this, because we can't hide when it is our code that has the bug.