Epistemology is the study of knowledge. The dictionary defines it as “that department of philosophy which investigates critically the nature, grounds, limits, and criteria, or validity, of human knowledge; Theory of cognition.” Woody Allen once called it the intellectual discipline that asks the question, “can we know what we know and if not, how do we know that?”
Allen’s tongue-in-cheek formulation notwithstanding, most people will dismiss the subject as esoteric and obscure — a suitable topic for an uninspired term paper from a bored undergraduate, perhaps, but nothing much to worry about in the “real world.” Yet, its seeming irrelevance is deceptive because the things we think we know profoundly influence the conduct of everyday life.
A recent editorial in the “Annals of Internal Medicine,” for instance, cites the findings of a meta-analysis of several studies on the efficacy of over-the-counter dietary supplements. The editors conclude, “We believe that the case is closed — supplementing the diet of well-nourished adults with mineral or vitamin supplements has no clear benefit and might even be harmful. … These vitamins should not be used for chronic disease prevention. Enough is enough.”
The reviewed studies found no health benefit whatsoever from taking a daily multi-vitamin and discovered that mega-doses of vitamins A and E and of beta-carotene did nothing to prevent cancer and actually corresponded with higher incidences of the dreaded disease.
The latter finding was unsurprising because it has long been known that too much of a good thing can lead to bad outcomes. Most doctors now agree that a glass or two of red wine daily promotes heart health, but few would recommend 10-20 glasses per sitting. The finding about multi-vitamins, however, left me in something of an epistemological bind because I’ve taken one every morning throughout my adult life and have generally enjoyed good health.
Of course, my experience is anecdotal and thus cannot be used as the basis for generalization. The only way I could scientifically validate it would be to relive the past 40 years without artificial vitamin intake and then compare my unsupplemented health status with its present state. As that experiment requires mastery of time travel, I won’t hold my breath. So, I’m left with a quandary.
On the one hand, I’ve long argued the need to subject belief to empirical test. Even obvious truth has to be validated before it can be relied upon. After all, from ordinary observation it’s apparent that the sun revolves around the earth, but Copernicus long ago demonstrated that this is not the case.
On the other hand, it’s difficult to deny the facts before your eyes regardless of their subjectivity. When I started taking vitamins, my vitality and endurance seemed to improve. The effect might have been psychosomatic, but it was real to me.
Well-nourished
For now, I’ve provisionally resolved the dilemma by concentrating on the adjective “well-nourished” in the editorial. Read dietary recommendations from the experts and you’ll discover that their idea of an adequate lunch involves grazing in the produce section of your local supermarket for an hour or two.
Personally, I don’t think I’ve ever purchased a head of red cabbage. Asparagus, beets and peas make me nauseated, while broccoli induces flatulence. For people like me who often eat on the run or skip meals altogether, maybe a daily supplement helps to fill the gaps. Then again, maybe not…
Minimum wage
Though I’m presently uncertain about the shot in the arm provided by multi-vitamins, I’m able to report with greater clarity on a controversy in the so-called “dismal science” of economics.
The president has proposed raising the federal minimum wage to $10.10 an hour and has already imposed the hike on federal contractors via executive fiat. Making the increase universal, however, will require congressional action.
Proponents argue that a higher minimum boosts the economy by putting more money in circulation. As low-wage workers tend to spend all of their earnings on household expenses, the theory goes, giving them more money directly stimulates economic activity.
The argument against the increase is twofold: that raising base pay is inflationary, which is true, and that such a raise will result in a net loss of entry-level jobs, which is not.
By definition, putting more cash into circulation is an inflationary measure. When money is cheap, prices tend to rise. But inflation hasn’t been a serious problem for more than a decade, so that shouldn’t be an immediate concern.
The second objection is based on the proposition that if demand for labor is stagnant and the price of labor rises, employers will hire fewer workers. Superficially, that appears to make sense, but the premise is refuted by an abundance of historical data.
The federal minimum wage was instituted in 1938, in the midst of the Great Depression, when the demand for labor was anything but robust. It has subsequently been raised 25 times, most recently in 2009. We thus have more than 70 years of data to draw upon when gauging its effect. The predicted job loss has never — ever — materialized.
The explanation for that seeming anomaly is simple: Employers don’t hire laborers because they can afford to; they hire them because they can’t afford not to. Businesses take on employees because they need them to generate profit, which is why upper-level tax breaks fail to create the jobs that broadly based increases in demand invariably generate. The optimum size of the raise may be debatable, but the case for the raise itself is substantiated by a wealth of empirical data.
The only real uncertainty is whether low-wage earners should be encouraged to use their proposed windfall to buy vitamins.