How to see into the future, by Tim Harford
The article may be gated. (I have a subscription through my school.)
It is mainly about two things: the differing approaches to forecasting taken by Irving Fisher, John Maynard Keynes, and Roger Babson; and Philip Tetlock's Good Judgment Project.
Key paragraph:
So what is the secret of looking into the future? Initial results from the Good Judgment Project suggest the following approaches. First, some basic training in probabilistic reasoning helps to produce better forecasts. Second, teams of good forecasters produce better results than good forecasters working alone. Third, actively open-minded people prosper as forecasters.
But the Good Judgment Project also hints at why so many experts are such terrible forecasters. It’s not so much that they lack training, teamwork and open-mindedness – although some of these qualities are in shorter supply than others. It’s that most forecasters aren’t actually seriously and single-mindedly trying to see into the future. If they were, they’d keep score and try to improve their predictions based on past errors. They don’t.
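The "keep score" idea can be made concrete with a proper scoring rule. The article doesn't specify one, but the Good Judgment Project is known to use the Brier score; here is a minimal sketch (the example forecasts are made up):

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and 0/1 outcomes.
    Lower is better; always guessing 50% scores 0.25."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# A forecaster's stated probabilities for five events, and what actually happened:
forecasts = [0.9, 0.7, 0.2, 0.6, 0.1]
outcomes = [1, 1, 0, 0, 0]

print(round(brier_score(forecasts, outcomes), 3))  # -> 0.102
```

Tracking this number over time is exactly the feedback loop the quoted paragraph says most pundits never close.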
I don't think it was a case of "I have a telescope, ergo I am correct"; I think it was more a case of "Here, look into this thing and see for yourself".
I was mostly trying to take an "outside view": whom should a layman (who is not necessarily able to replicate an experiment himself or herself) believe?
Suppose an acclaimed professor (in earlier times, a famous natural philosopher) and a grad student (or an equivalent in earlier times) are trying to figure something out, and their respective experiments produce differing results. Suppose their equipment is of the same quality. Whom should a layperson bet on being correct before further research becomes available? Would even the grad student be confident in his or her own result? Now suppose the grad student had access to significantly better and more modern tools (such as a telescope in the early 1600s or an MRI scanner in the 1970s). The situation changes completely. If the difference in the quality of lab equipment were sufficiently large (e.g. CERN vs an average high school lab), nobody would even bother with a replication. (By the way, given equipment of the same quality (e.g. only the unaided senses), if the difference in authority were sufficiently large, would the situation be analogous? I'm not sure I can answer this question.)
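The "whom should a layperson bet on" question can be phrased in Bayesian terms. A toy sketch (all the numbers below are invented purely for illustration, not taken from anywhere):

```python
# Hypothesis: the professor's result is correct (vs. the grad student's).
# Made-up prior: authority alone gives the professor an edge.
prior_professor = 0.7

# Case 1: equal equipment. Each is equally likely to have erred, so the
# conflicting results don't move the prior; the layperson still backs
# the professor at 0.7.

# Case 2: the grad student has much better instruments (a telescope in
# the 1600s). Model this, crudely, as the student's experiment going
# wrong only 5% of the time vs. 40% for the professor's unaided one,
# and assume exactly one of the two erred.
p_error_student = 0.05
p_error_professor = 0.40

weight_professor = prior_professor * p_error_student        # student erred
weight_student = (1 - prior_professor) * p_error_professor  # professor erred
posterior_professor = weight_professor / (weight_professor + weight_student)

print(round(posterior_professor, 3))  # -> 0.226
```

With these (invented) numbers, better equipment flips the bet toward the grad student despite the professor's head start in authority, which is the "situation changes completely" point above.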
A more mundane situation: suppose a child claims to have seen some weird object in the sky with their naked eye. Others would ask: why hadn't anyone else (whose eyes may be even better) seen it before? Why hadn't anyone else (who is potentially more intelligent) identified it? Now suppose the same child has a telescope. Even if others would not bother to look at the sky themselves, they would be much more likely to believe that the child could actually have seen something real.
In no way am I trying to downplay the importance of replications, especially cheap replications, such as letting everybody look through your telescope. (Besides replicating that particular observation, this also serves a more general purpose: people have to believe that you really do possess an "extended sense" rather than just making it up, as many self-proclaimed psychics do.) The ability to replicate cheap experiments is crucial. So is the fact that (in the ideal world, if not necessarily the real one) there are people who have the means to replicate difficult and expensive experiments, and the willingness (and/or incentives) to actually do so and honestly point out whatever discrepancies they find.
It seems necessary to point out that this is probably just a "just-so story"; an actual historian of science could probably make a much more informed comment on whether the process I described was of any importance at all.
Anyway, this conversation seems to have strayed a bit off topic and now barely touches the Financial Times article.