I just finished a very enjoyable book, A Short History of Nearly Everything, by Bill Bryson. The book presents, in layman’s terms, what we know about the Earth, the universe, our ecosystem, and the human race through the history of how we learned it all. Or, more specifically, how the people who figured it all out did so. The science you learned in high school is more than enough to access and understand the book, and I quite recommend it.

While I did come away knowing a good deal more than I previously did about the topics the book covered, the most interesting takeaway for me was the enormous role that individuals’ peculiarities, personalities, egos, stubbornness, rivalries, and other elements of human nature played in the history of science and the accumulation of knowledge. Examples are myriad, and I won’t spoil the book’s charms by trying to repeat them here. Broadly speaking, the history of scientific research and discovery is awash in bad theories that lingered far longer than they should have because people built careers and reputations on them, and in discoveries that languished for decades because their discoverers were secretive, shy, retiring, or meek. It is awash in professional rivalries that stood in the way of progress; in condescension toward those of lesser pedigree, the wrong gender, inadequate résumés or credentials, or insufficient genuflection, which led to breakthroughs and insights being overlooked, ignored, or rejected; and in circumstances and tragedies that deflected or cut short those insights and breakthroughs. In short, even geniuses and scientists of great esteem are, first and foremost, human beings with human failings.

This lesson remains quite relevant today, despite the far greater accessibility and sharing of information that modern technology provides, because human nature hasn’t changed. Indeed, the “failings” that make even the greatest minds something other than dispassionate, Spock-like seekers of objective truth are exacerbated by this modern technology and the nature of modern research. Whereas the work of many of the great minds of the past was funded either by their own wealth or by patrons, researchers today overwhelmingly rely on outside money guided by outside interests, and while it’s fun to presume that “the bad guys” are the only ones tempted to pressure researchers toward certain outcomes, that’s just self-delusion. He who pays the piper calls the tune, and those who fund research often feel pressures of their own in politically charged fields.

In addition, the “gotcha” nature of public discourse nowadays makes it extremely difficult for those who publicly avow a particular view or theory or conclusion to, later on, modify it, even when new information makes it proper to do so. Many people put agenda ahead of truth, especially when truth is complicated or nuanced or not fully developed or too complex for a layman to fully comprehend, and many such have shown no compunction about lobbing personal attacks at those who cross their agenda or who abandon positions that supported that agenda.

The flip side of this matter is found among those who elevate science and scientists to a mythical, dogmatic, or quasi-religious level — people who open statements with “Scientists say…,” treat what follows as inviolate and sacrosanct truth, reject any rebuttal, and cite credentials as proof of accuracy. Doing so is the classic appeal-to-authority logical fallacy. What is stated as true might very well be true, but the mere citing of someone with a title or degree as proof of that truth is anything but.

Does any of this mean we should not believe scientists when they tell us stuff? Not in the slightest. That would be as fallacious an act as believing them solely because they are scientists. Nor should we hold a blanket bias against them and that which they say. In general, we have far more reason to trust what scientists present as scientifically derived conclusions than the twaddle perpetuated by hucksters, woo-natics, astrologers, new-agey crystal worshipers, mystics, seers, and the like. BUT, and this is the lesson, we should not blindly presume that all that a great mind concludes or declares is unassailable. They are as human as the rest of us, as (or more) susceptible to ego and stubbornness as we are, and quite possibly wrong (as many of the greatest minds in history have been). There are objective truths in the universe, and we know some of them. But, there is a tremendous amount that we do not know, and there are things that we “know” that may very well be wrong. Science is an ever-evolving field, and what’s known today may be disproved tomorrow. It is an unfortunate reality that such disproofs can unjustly destroy careers and reputations, simply because people are people, and thus there’s often enormous pressure on those whose careers and reputations would face such peril to resist that which may very well be correct but would do them harm.

When it comes to controversial topics, ones with social, political, or religious overtones, or ones where conclusion leads to policy, it is incredibly hard to have an open mind and be receptive to new ideas and evolving theories. It is, however, vitally important that we do. Not to the point of entertaining the baseless and the crackpot — a properly reached set of beliefs and conclusions should not be discarded merely because someone says “boo” — but a stubborn refusal to consider new information after coming to a conclusion is just as wrong. The history of science and scientists makes that abundantly clear.

Peter Venetoklis

About Peter Venetoklis

I am twice-retired, a former rocket engineer and a former small business owner. At the very least, it makes for interesting party conversation. I'm also a life-long libertarian, I engage in an expanse of entertainments, and I squabble for sport.

Nowadays, I spend a good bit of my time arguing politics and editing this website.
