The wake-up call, I think, is recognizing how others who completely disagree with you feel the same way about you and your beliefs.
I disagree with this. Sure, both sides in major disagreements often actively dislike each other because of the perceived damage the other side is doing. But it's perfectly plausible, and actually happens quite often, that one side is mostly right and the other side is mostly wrong. And when one side is actually right, merely seeing that the other side thinks badly of it isn't particularly useful, in my opinion.
This doesn't mean you shouldn't put forth the effort to understand the other side and, more importantly, to verify that you're on the correct side in such a dispute.
But this should be the case all the time: you should always be checking your ideas, even when there is no active dispute among most of the population. There have been plenty of times in the past when "everyone knew" something, only to find out later that everyone was wrong. No special signal of disagreement should be necessary to trigger such questioning.
The "trick" in all this is to be rigorous in your collection of data and weighting of it based on its plausibility. Plausibility testing can be done in many ways: one of the basic ones is doing corollary analysis, such as "if this is true, what else would need to be true/false", and how reasonable are all those corollaries. This is where I think most people fail when forming opinions. They are too eager to accept data as true and find it difficult to maintain a certain amount of skepticism. And a few people suffer from the reverse problem, where they reject all data as untrustworthy, even when much of the data they have forms a plausible view of reality.
Plausibility analysis can also be based on an information source's track record for accuracy, but this is often harder than corollary analysis, since it first requires a frequently difficult vetting of sources. People often use this method when choosing what to believe, but they usually don't do a good job of rating their information sources.
A proper vetting of sources would include things such as their historical accuracy, potential conflicts of interest, and so on. The reputation system I want to build will be designed to help with this latter endeavor, especially nowadays, when so much of our data comes from internet sources that are hard to get accurate information about.
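As an entirely made-up illustration of that kind of source weighting (not a description of the reputation system itself), here is a small sketch that discounts reports by a crude reliability score built from historical accuracy and conflicts of interest. The sources, the accuracy numbers, and the 0.5 discount are all invented for the example.

```python
# Toy sketch: weighting reports by a crude source-reliability score built
# from historical accuracy and a penalty for conflicts of interest.

from dataclasses import dataclass

@dataclass
class Source:
    name: str
    historical_accuracy: float  # fraction of past checkable claims that held up
    conflict_of_interest: bool  # does the source benefit if the claim is believed?

def reliability(source: Source) -> float:
    score = source.historical_accuracy
    if source.conflict_of_interest:
        score *= 0.5  # arbitrary discount; the point is that you discount, not the number
    return score

def weighted_support(reports) -> float:
    """reports: list of (Source, supports_claim) pairs. Returns the
    reliability-weighted fraction of reports supporting the claim, in [0, 1]."""
    total = sum(reliability(source) for source, _ in reports)
    if total == 0:
        return 0.5  # no usable evidence either way
    supporting = sum(reliability(source) for source, supports in reports if supports)
    return supporting / total

reports = [
    (Source("Independent lab replication", 0.9, False), False),
    (Source("Vendor press release", 0.6, True), True),
    (Source("Anonymous forum post", 0.3, False), True),
]
print(round(weighted_support(reports), 2))  # 0.4: the weight of evidence leans against
```

Notice that a simple head count of the reports would favor the claim two to one; weighting by source quality flips the conclusion, which is exactly the point of vetting.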
As a final note, it probably goes without saying, but it's very important when doing either of these forms of analysis to struggle diligently against confirmation bias. Confirmation bias occurs when you give more weight and credence to the data and opinions that tend to confirm a belief you already want to hold. Because almost everyone is subject to some amount of confirmation bias, it's actually good to spend extra time trying to confirm the opposite of your desired beliefs whenever possible.
When there is a disagreement among people about a specific belief, politely talking to people who hold the opposite opinion can be a useful way to double-check your beliefs. When doing so, though, it's important to pick the best proponents of that belief, not simply the ones who are easiest to defeat in an argument.
Another way to attack confirmation bias is by "unlinking" beliefs that aren't logically related. Many of our beliefs are "group beliefs": a common set of beliefs we share with a group we identify with. Often these beliefs have no logical relationship to one another; they just happen to be held by most of the group. This tends to happen out of a desire to "fit in", and also because we get much of our data and many of our beliefs from the people we commonly interact with, so if we mostly interact with one group, most of our information and beliefs come from that same group.
If you find yourself mostly agreeing with one group of people, and there is no logical linking between the beliefs of the group, it's a good idea to double-check such beliefs carefully, as you've very likely been the victim of confirmation bias.
I think you restated a version of my point: the wake-up call is recognizing that the "other side" may feel the same negative emotions towards you as you do towards them. Recognizing this, I think, makes it possible to have the polite conversations that are so important for clearer thinking and understanding.
Just as you said, throughout history "everyone knew" things that turned out to be wrong. That's why I hesitate to hold too strongly to "one side is mostly right and the other side is mostly wrong." Does that mean I'm ready to give time and attention to flat-earthers? Well, no. But does that remove the possibility that we are actually living in a simulation, and that those who think of the earth as "flat" may be connecting to some useful truth about the nature of reality? Certainly possible for those who can think in multiple dimensions and with nuance.
So much of what you said here is valuable. Thank you, Dan. Lots of confirmation bias going on. Trying to disprove our own beliefs is one of the few ways we actually gain knowledge as described in this simple game:
My friend Sean King has a nice list of life maxims that help him with his plausibility analysis. I think that's a great pattern to use: find some wise, useful tools for clear thinking and stick to them, always being open to re-evaluating and refining them over time, but ultimately settling into good thinking patterns that help us all be less wrong.