How do we know what we can know, or how we should know, or what constitutes "knowing" in the first place? How certain can we be that our knowledge is correct? Good questions all, none of which I can answer; but here are a few recent articles related to such conundrums.
At Aeon, John Schwenkler points out the difficulties in deciding whether or not we should ever venture forth from our epistemic bubbles and expose ourselves to the contagion of different ideas:
When we consider how a certain choice would alter our knowledge, understanding or ways of thinking, we do this according to the cognitive perspective that we have right now. This means that it’s according to our current cognitive perspective that we determine whether a choice will result in an improvement or impairment of that very perspective. And this way of proceeding seems to privilege our present perspective in ways that are dogmatic or closed-minded: we might miss the chance to improve our cognitive situation simply because, by our current lights, that improvement appears as a loss. Yet it seems irresponsible to do away entirely with this sort of cognitive caution. How much is too much, though, and when is this caution appropriate? And is it right to trust your current cognitive perspective as you work out an answer to those questions? (If not, what other perspective are you going to trust instead?)
Not to give away too much, but in the course of his essay Schwenkler persuasively argues that, if you're planning to go to the grocery store, you should first check to make sure the store will be open; this makes the essay a rare example of philosophical discourse with practical application.
https://aeon.co/ideas/should-you-shield-yourself-from-others-abhorrent-beliefs
In an essay (at the Weekly Standard) that is simultaneously entertaining and provocative, Daniel Sarewitz asks how we can "demarcate" science from non-science or good science from bad science:
It’s one thing for theoretical physicists to chase the wrong theory about fundamental particles for 25 years with nothing to show for it but high-prestige publications. That’s fun. But what can be said when a long series of clinical trial failures suggests that neuroscientists have been chasing the wrong theory for Alzheimer’s disease for 25 years with nothing to show for it but high-prestige publications? When hundreds of published breast-cancer studies turn out to be based on contaminated samples, when thousands of brain-imaging studies turn out to be statistically flawed, when economic theory continues to build on assumptions about human behavior that are known to be wrong, it becomes rather difficult to understand how one can separate self-correction from bad science from nonscience from delusion from corruption.
A healthy dose of skepticism is required, to be sure, and in the end the average person may just have to pick a side and hope for the best:
Science cannot be cleanly demarcated from nonscience, and much of what we are hoping that scientists can tell us these days—about nutrition and health, about economics, the environment, education, aging, and the origins of the universe—will emerge from the vast fuzzy area between the two. Arguments over the results and implications of such work will be never-ending and will be peppered with accusations that one side or the other is being unscientific. What we nonexperts choose to believe about such matters will depend much more on whom we trust and what we find to be helpful than on what can be known to be true.
https://www.weeklystandard.com/daniel-sarewitz/all-ye-need-to-know
Finally, writing at Philosophy Now, George Dunseth offers "Twelve Principles of Knowledge" to help us determine whether particular ideas are true. Mr. Dunseth seems understandably proud of his achievement:
As I began to think and exchange ideas I soon realised that it is important to be reasonable and rational. But I then felt a powerful need to understand what that means. And so I began making a lifelong, constantly revised, simple list of how all of us support our truth claims.
What counts as evidence for truth in rational argument? I have attempted to be simple, clear and exhaustive. These principles can be printed on a piece of paper and posted proudly on your refrigerator. They apply both to the sciences and the humanities, since science does not have a monopoly on reason.
None of the principles are sufficient in themselves, and some are clearly stronger and more warranted than others. The more of them that apply to your claim, the more warranted your truth claim is – we could even say, the more reasonable it is.
The twelve rules can indeed be posted on your refrigerator, or perhaps reproduced on a laminated card you can carry in your wallet. What better way to settle the typical barroom dispute than to whip out your copy of "Twelve Principles of Knowledge" and subject your opponents' claims to detailed scrutiny?
Of course, the "loser" of such a contest could always quote Dunseth's own conclusion: "I think it is wise to be open to the idea that there may be truths inaccessible to reason or outside its parameters." That loophole should be adequate for almost any kind of nonsense to slip through, and it will therefore surely come in handy for most of us.
https://philosophynow.org/issues/124/Twelve_Principles_of_Knowledge