
Scientific Results vs. Common Sense


The advent of the age of science has allowed for numerous technological advancements. The eternal search for truth has channeled itself into modern science over the past few hundred years. The study of the physical sciences has existed since time immemorial, as defenseless man sought to control and live in nature, and in the past few hundred years it has finally culminated in the pinnacle of technological progress in human society. But given all that progress, science has now become a new religion, simply replacing the church without providing anything more of substance. That is not to say that the scientific method does not work, but simply that it has been extended beyond its useful capacity, creating an atmosphere of blind acceptance of scientific results, the complete antithesis of the scientific process, which is based, above all, on skepticism.

To truly understand science, one needs to go back to the very first principles, ie. the scientific method. One forms a hypothesis, designs an experiment or otherwise collects data, and then verifies whether the hypothesis is correct. Essentially, the scientific method is a fancy way of saying “guess and test”. So, without venturing into the realm of philosophy, scientific realism, epistemology, what can be known and what cannot, let’s just assume that on the most basic level, the scientific method gives an approximate description of reality over long periods of time. Over short periods of time, however, science is as reliable as astrology.

One only needs to look as far as some of these pseudo-sciences. They follow the forms of “science”, but due to inadequate data, overspecialization, lack of predictive power, assumptions of constant theoretical improvement, failure to control environmental variables, or lack of falsifiability (Popper), they fail quite miserably at showing anything whatsoever that corresponds to reality. Unfortunately, due to the divine infallibility of the oracle of modern science in present culture, what should be common sense is now obscured. Perhaps, in time, these fields may evolve into something with predictive power, but at present, we must be very cautious in examining the results of these “sciences”.

Anyway, a few prime examples of sciences that give results that cannot be trusted come to mind.

Food Science

Human diet has evolved over the centuries, with civilization progressing from hunter/gatherer to an agricultural mode of living. We have isolated various compounds over time – from vitamins to minerals that are necessary for life, but have also created an abundance of confusion.

Eggs: There are an absurd number of articles from 1950 to the present day that just don’t solve the problem of whether one should eat eggs. Historically, eggs were part of a healthy diet and a good source of protein, but all this changed around 50 years ago when it was found that eggs were a source of cholesterol and apparently increase the level of serum cholesterol, which is an indicator of heart disease. Then, if I recall correctly, came the advent of good and bad cholesterol, and the eating of eggs was rendered good again. Note the levels of indirection: it never was shown that eating a good deal of cholesterol actually increases serum cholesterol, or that increased serum cholesterol is a necessary condition of heart disease, both of which are necessary for any certainty whatsoever about the conclusion.

Modern literature states that eating more than one egg a day is related to an increased risk of heart failure[i], that regular egg consumption (an average of one or more per day) does not increase the risk of stroke and cardiovascular diseases[ii], that data from free-living populations show egg consumption is not associated with higher cholesterol levels and the epidemiologic literature does not support the idea that egg consumption is a risk factor for heart disease[iii], that the serum cholesterol level did not change in the majority of a study group after regular consumption of two eggs per day, though some exhibited increases and others decreases[iv], and that consumption of up to one egg per day is unlikely to have a substantial overall impact on the risk of heart disease[v]. And we haven’t even gone into whether only the yolk should be eaten, whether it should be raw or cooked, or whether eggs are only healthy on rainy Sundays with a good dollop of mayonnaise. So to summarize: we have no idea, but it seems eating too many eggs is bad, which lies in the realm of common sense, since eating too much of anything is bad.

I have utmost respect for those who attempt to find some aspect of truth in this mess of data, which is absurdly difficult to do. Still, nutritional science fails the time test (truth doesn’t change so quickly over time unless there are some stimuli) and the specialization test (it attempts to draw a single conclusion while ignoring all other factors, especially when we have no idea what else changed in a person’s diet).

Milk: There is an ongoing debate about the safety of consuming raw milk versus pasteurized milk. The debate has a shaky historical basis: pasteurization was invented only in 1864, and milk began to be widely pasteurized in the United States only in the 1920s, shortly after World War I. There are numerous articles for and against the consumption of raw milk, but the summary is quite succinct and takes the form of a question.

Would you want to eat or drink a product from any animal that’s diseased?[vi]

Seems rather foolish, but that’s what we do on a daily basis. To ensure our health, we then need to destroy anything and everything in the food. The socio-economic factors behind the spread of pasteurization give a better explanation of why it is now widespread. After the Selective Service Act of Sept. 1917[vii], 2.8 million men were drafted in two years, out of a US population of approximately 100 million[viii]. With the labor force so reduced, it was quite likely much more difficult to take care of livestock. Maintaining cows is quite labor intensive, and the quality of the milk is most likely correlated with the health of the cow. But even if we assume that there was enough labor left to take care of the cows, how about the transportation of the milk? Usually, milk had to be delivered fresh because it would go sour after a few days; still usable in a variety of recipes, but certainly less tasty. Pasteurized milk, on the other hand, lasts close to ten times as long without going putrid, doesn’t go sour, and is much easier to distribute. On a societal level, in my opinion, there simply was not enough manpower during the wars for the production and distribution of raw milk to continue in a safe manner. In the present day, it’s simply not possible to produce quality raw milk on a large scale, as a raw dairy farmer would have difficulty taking care of more than a few dozen cows. Compare this to a modern dairy operation with sometimes up to 5,000 cows, and one can see that pasteurization isn’t going anywhere. The safety of raw milk itself isn’t in question, but whether one can keep healthy cows and distribute the milk safely certainly is. I would hesitate to drink milk from diseased cows (or eat diseased meat), and would certainly advocate pasteurization if I were going to drink milk from a diseased animal.
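As a rough sanity check on those draft numbers, the arithmetic can be sketched directly. The workforce figure below is my own illustrative assumption, not from the cited sources:

```python
# Rough sketch of the draft's impact on the labor force.
# Figures for drafted men and total population are from the text above;
# the male-workforce estimate is a hypothetical assumption.
drafted = 2_800_000
population = 100_000_000
male_workforce = 30_000_000  # assumed working-age male population

print(f"Share of total population drafted: {drafted / population:.1%}")
print(f"Share of male workforce drafted:   {drafted / male_workforce:.1%}")
```

Even under this generous assumption, losing nearly one in ten working-age men (plus those who moved into war industries) would plausibly strain labor-intensive dairying.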

Back from that digression: the pasteurized vs. raw debate fails both the time test and the common sense test. There simply hasn’t been enough time to truly know the effects of drinking pasteurized milk, or more aptly put, sterile milk from diseased cows. Man has only been drinking pasteurized milk for around 80 years, and the homogenization of milk started only after that, again to aid in the transport of milk and perhaps to hide the lack of cream content by which milk is judged. However, there are already some notable subtle changes. In old medical texts, milk was recommended as a remedy for eczema, and women bathed in it to maintain their skin, while in modern medicine it causes eczema. Unfortunately, there is no good way to test subtle changes in diet over short periods of time.

On the other hand, raw milk has been drunk for several thousand years, and is still drunk by a sizable population in India and small pockets around the world. That raw milk is inherently unsafe fails the common sense argument. If it were inherently unsafe, one would think that it would have gone the way of aspartame or asbestos. There is now a great deal of research being done which supports the raw milk movement. Thankfully, the science is also supported by the common sense test and a substantial population that has been consuming it for several thousand years. The safe distribution of raw milk and the finding of healthy cows can only be done well on a small scale, so that matter is still up in the air.

I’ll briefly mention some of the other fiascos of the “food science” movement. Aspartame is now suspected to be a carcinogen. Preserving food with nitrates is preserving it with a poison. If food has gone rotten, one should probably not eat it, rather than just masking the smell and taste. To summarize, food scientists have not collected enough data over a long enough period of time to come to any sort of conclusion about anything. Use your common sense.


Economics

Modern economics fails the falsifiability criterion and lacks predictive power. One needs to look no further than the thousands of contradictory economic predictions we read in the business section or the Economist. Suffice it to say, mathematical economics is simply a sham, as it doesn’t model reality, while the other schools of economics are based on assumptions which are no longer true, giving rise to a large vacuum in present economic thought which is yet to be filled. Considering that mathematical/analytical economics is the flavour of the day, here lie some of the major problems with mathematical analysis.

Modeling in economics is a pernicious affair, as we have to simplify reality to create a model. For instance, the pricing of options is based on the assumption that prices will follow a relatively smooth path either up or down based on the overall volatility of the stock. A Gaussian, or bell curve, is used to model the probability of the change in prices over time. Unfortunately for the model, events do happen that are not reflected in a Gaussian distribution. For instance, a company may release a new product line or announce possible financial issues. The announcement causes prices to either rise or fall, but the actual probability of such a jump isn’t reflected in option pricing based on a Gaussian. For instance, if one knew that the UN was working on an agreement with Ecuador to give it 3.6 billion dollars not to drill its oil, one would probably expect that oil companies in Ecuador would not benefit from this, and one could exploit that knowledge.
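A small simulation illustrates the point. Under a pure Gaussian model with 1% daily volatility, a 5-sigma day essentially never happens, while mixing in occasional announcement “jumps” makes such days routine. All parameters here are illustrative, not drawn from any real stock:

```python
import random

random.seed(0)

sigma = 0.01       # 1% daily volatility (illustrative)
jump_prob = 0.01   # chance per day of a surprise announcement (illustrative)
jump_size = 0.08   # size of the announcement move (illustrative)
n_days = 100_000

def big_move_freq(with_jumps):
    """Fraction of simulated days with a return beyond 5 sigma."""
    big = 0
    for _ in range(n_days):
        r = random.gauss(0, sigma)
        if with_jumps and random.random() < jump_prob:
            r += random.choice([-1, 1]) * jump_size
        if abs(r) > 5 * sigma:
            big += 1
    return big / n_days

print(f"Gaussian only: {big_move_freq(False):.5f}")  # effectively zero
print(f"With jumps:    {big_move_freq(True):.5f}")   # roughly jump_prob
```

A trader pricing options off the Gaussian column alone would systematically underprice the risk of the announcement days.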
Another problem is that a model occasionally has no predictive power because of the type of data one uses to build it. For instance, if one uses data from 1995-2005 to build a housing model, one will come to the conclusion that housing prices always go up. I’m sure it’s a great model for the years 1995-2005, but it doesn’t really model reality. Depending on what data one uses, one could make the model say pretty much anything. In fact, for every impartial model, there is a rather impressionable modeler. Despite the scientific rigor that one assumes for this type of work, the truth is that in every model there are a few variables which can be “tweaked” to give a result the modeler is partial to. As a physicist by training, I’ve certainly tweaked various dielectric constants to correspond with experimental results, within experimental error of course. Unfortunately, economics and finance have no repeatable experimental results to speak of, and the human condition is such that the models we build have our inherent biases included.
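The in-sample trap above can be sketched in a few lines: fit a straight line to a decade in which prices only rose, and the model will forecast rising prices forever. The price series below is made up purely for illustration:

```python
# Hypothetical housing price index that only ever rose, 1995-2005.
years = list(range(1995, 2006))
prices = [100.0 + 8.0 * (y - 1995) for y in years]  # made-up index values

# Ordinary least-squares fit of price vs. year, computed by hand.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(prices) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, prices)) \
        / sum((x - mean_x) ** 2 for x in years)
intercept = mean_y - slope * mean_x

def predict(year):
    """Extrapolate the fitted trend; it has never seen a downturn."""
    return intercept + slope * year

print(predict(2009))  # the model can only forecast further rises
```

The fit is perfect in-sample, which is exactly why it says nothing about 2008.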

Mathematical models in economics don’t account for the influence of large parties that are able to cause sharp and sudden dips. For instance, if Warren Buffett decided to liquidate all his stock, it would probably cause a large panic and a drop in stock prices, after which he could buy everything back at much lower prices. Perhaps the most famous (if likely apocryphal) example is the story of Rothschild around the Battle of Waterloo, which is said to have vastly increased his fortune. Currently, China owns close to 1 trillion dollars of US Treasury Securities. This means that if China stops buying US debt, financing the US government suddenly gets much more expensive. I’m sure that several thousand risk analysts have built a model for this, and I’d bet some money that one of them has it right. Amusingly enough, a model can be built for almost anything, but it may still lack predictive power. Federal Reserve Chairman Ben Bernanke states, “Neither experience nor economic theory clearly indicates the threshold at which government debt begins to endanger prosperity and economic stability.” [x]

Finally, nothing is testable. Look at Paul Krugman’s essay on Japan’s liquidity trap[xi]. Apparently, the entire world is still in an untestable liquidity trap[xii]. The fact that modern economists are finally in ill repute after the past half-century shows that common sense is not lacking in the people. Economists cannot predict anything because it’s not possible to model something as complex and irrational as the economy. I’m certainly not the only one who feels that economics has lost its way; the esteemed Heilbroner wrote an entire book on the problems with mathematical economics, “The Crisis of Vision in Modern Economic Thought.” To summarize: as far as economics refers to reality, it is not certain, and as far as it is certain, it does not refer to reality.

Pharmaceuticals and Drugs

There has been a transition over the past two thousand years as humans tried to avoid and combat illness. We’ve gone from eating the right foods, to eating the proper herbs, to producing potions and pills with the right ingredients, to creating and consuming the right synthetic drugs. Ironically, we’re now back to eating the right foods.

Modern drugs are indeed one of the crowning achievements of science. That they work at all is a miracle, considering the progress that has been made over the past 80 years. Aspirin, the acetylated form of salicylic acid found in the curative bark of the willow tree, may be considered by some the first major success. It was well known that willow-bark tea soothed pain, but it was still quite a step to produce that in pill form.

Pharmaceuticals are on a whole different level. I’m not truly qualified to speak on the complexities of the human body, with its receptors, molecule binding, neuropeptides, and what not. Unfortunately, our knowledge of the human body is so incomplete that I don’t think anyone is qualified to talk about it. For instance, there’s the well-taught membrane-pump theory, which is disputed by Gilbert Ling[xiii]. Again, I’m not qualified to judge the merits of these arguments, and I’m not even sure such a person exists.

Modern pharmacology is scientific through and through, but again, the over-application of very specialized results and inadequate testing or data renders some of the results dubious. Note that the drugs do indeed do what they are designed to do; in most cases, that is to bind to some receptor or to flood some site, but it’s the rest of the effects that are in question. Let’s say that binding to receptor X with molecule Y does indeed solve problem A most of the time. How can we possibly know what other side effects there are in doing this? Again, the sampling size for human tests is too small and the duration of the tests too short to be useful. An ironic consequence of the constant improvement and innovation of the pharmaceutical industry is that doctors in the United States need to be re-tested every six months[xiv] on the new best practices, ie. the new drugs to prescribe. Do you really want to go to a doctor who says, “Well, six months ago I thought I knew what I was doing, but then I took a test last week, and now I really know what I’m doing, because I just found out the drug I gave you last time could have given you a heart attack”? It certainly doesn’t inspire my confidence.

Steroids: The well-known corticosteroid drug Prednisone is used to treat a variety of illnesses from eczema and asthma to cancer. The list of side effects, however, is not minor and varies from person to person: increased blood sugar for diabetics, weight gain, facial swelling, depression, mania, psychosis or other psychiatric symptoms, unusual fatigue or weakness, mental confusion / indecisiveness, blurred vision, abdominal pain, peptic ulcer, infections, painful hips or shoulders, steroid-induced osteoporosis, osteonecrosis, long-term migraines, insomnia, severe joint pain, cataracts or glaucoma, anxiety, black stool, stomach pain or bloating, severe swelling, mouth sores or dry mouth, avascular necrosis, hepatic steatosis, are among some of the problems. Again, these drugs are very good at the specific task that they do, but the question is, “What else do they do?”

If you think that the FDA or other agencies are protecting you against this problem, perhaps you should think again. When an official from the FDA states that something is safe, what it really means is that they have no proof that it’s unsafe. The mandate of the FDA is to make sure that what it approves is not demonstrably unsafe, which certainly isn’t the same as ensuring that something is safe; the latter would require much more stringent levels of data, which nothing novel would pass. One only needs to look at a few illustrative examples, such as the FDA’s stance on BPA. Dr. Joshua Sharfstein of the FDA said, “If we thought it was unsafe, we would be taking strong regulatory action.” Compare that to the stance of Canada, which banned BPA in 2008, taking a precautionary approach even though the scientific evidence was inconclusive. A more serious failure of regulation occurred with thalidomide, from 1957-1961, which, in addition to treating morning sickness, also caused severe birth defects.

The crux of the problem is that scientific evidence is inconclusive most of the time. Most complexities arise from the problem of dosage, and as Paracelsus, sometimes called the father of toxicology, once said, “All things are poison and nothing is without poison; only the dose permits something not to be poisonous.” There have even been deaths from drinking too much water. A recent example of the dosage complexities would be the melamine scandal, where infant formula in China was tainted with the substance. On Oct. 3, 2008, the FDA stated quite concisely that it “cannot establish a level of melamine and its analogues in these products that does not raise public health concerns”[xv], based mostly on the lack of data. The media ran with it, and somehow the story turned into “no amount of melamine is safe”. In a further update on Nov. 28, 2008, notably after it was found that US infant formula was also tainted, the FDA stated that “levels of melamine or one of its analogues alone below 1.0 ppm in infant formula do not raise public health concerns”[xvi], which doesn’t contradict the previous statement. Looking at the document carefully, it appears to have been drafted by a lawyer, considering the exactness of the wording on toxicity and the ambiguity of “public health concerns”. In any case, you must be thinking, “On one hand, this guy is saying one shouldn’t trust the FDA, and on the other hand, he’s defending them. He’s an idiot.” What I’m actually doing is pointing out how difficult it is to establish scientific evidence for anything, and that one should always err on the side of caution. The FDA definitely fails the time test, as it can never gather enough data to judge whether something is safe, so it must settle for the lower standard of showing that something is not unsafe. That one cannot prove something unsafe over a short period has no bearing on whether it is safe over a long period.
Rather than a mandate of innocent until proven guilty, it’s probably much safer to presume guilty until proven innocent.
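To see how small the quantities at issue are, here is a back-of-the-envelope dose calculation at the FDA’s 1.0 ppm threshold. The formula intake and infant weight below are illustrative assumptions of mine, not figures from the FDA documents:

```python
# Back-of-the-envelope dose check at the FDA's 1.0 ppm formula threshold.
# 1 ppm by mass = 1 mg of melamine per kg of formula consumed.
ppm_limit = 1.0            # mg melamine per kg of formula (FDA threshold)
formula_per_day_kg = 0.15  # ~150 g of formula per day (assumed)
infant_weight_kg = 5.0     # assumed infant body weight

daily_dose_mg = ppm_limit * formula_per_day_kg
dose_per_kg_bw = daily_dose_mg / infant_weight_kg

print(f"Daily dose:      {daily_dose_mg:.3f} mg")
print(f"Per body weight: {dose_per_kg_bw:.3f} mg/kg/day")
```

The threshold question is whether fractions of a milligram per kilogram per day matter over months of exposure, which is exactly the kind of long-duration question short studies cannot answer.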

Agriculture

DDT is a classic example of overspecialization causing unintended consequences, as the narrow focus of the research prevented anyone from seeing the complete picture. Paul Hermann Müller, the chemist who discovered the insecticidal properties of DDT, was awarded the Nobel Prize in 1948. Indeed, DDT is an excellent pesticide; it also destroys a wide array of wildlife, is linked to diabetes, and is a suspected carcinogen. The consequent launch of the environmental movement is perhaps the only notable positive from this incident. Due to the specialization of science, what else DDT did wasn’t really Müller’s concern, and it took years to figure it out. Not to mention that it was also largely useless within a decade due to insect resistance.

Another critique regarding agriculture comes from “Natural Farming” by Fukuoka, a microbiologist specializing in plant pathology turned farmer in Japan. In a very specific criticism spanning around a hundred pages, he outlines the problems in modern scientific thought in agriculture, from pesticides to fertilizer to Liebig’s law of the minimum. Among his points: attempting to modify only one variable while keeping everything else constant simply doesn’t apply to the real world, as one can never recreate the laboratory experiment in nature, and the assumption that adding fertilizer, pesticides, etc. will give a high-yield crop is not necessarily true. In fact, he contends, “Far from benefitting agricultural productivity, progressive specialization in research actually has the opposite effect. Methods intended to boost productivity lead instead to the devastation of nature, lowering overall productivity.”

Nature is extremely complex and not easily understood. He states, “Specialists in various fields gather together and observe a stalk of rice. The insect disease specialist sees only insect damage, the specialist in plant nutrition considers only the plant’s vigor. This is unavoidable as things are now. As an example, I told the gentleman from the research station when he was investigating the relation between rice leaf-hoppers and spiders in my fields, ‘Professor, since you are researching spiders, you are interested in only one among the many natural predators of the leaf-hopper. This year spiders appeared in great numbers, but last year it was toads. Before that, it was frogs that predominated. There are countless variations.’ It is impossible for specialized research to grasp the role of a single predator at a certain time within the intricacy of insect inter-relationships.” [xvii]

His criticisms of fertilizer are numerous: 1) fertilizers weaken the plants, lowering their resistance to disease and pests; 2) fertilizer applied to soil is not as effective as in lab experiments, ie. some 30% of the nitrogen is denitrified by microorganisms and it penetrates only the first two inches of soil; 3) fertilizer causes direct and indirect damage, including acidifying the soil; 4) a deficiency in trace components occurs when using chemical fertilizer. Modern literature also shows that indirect damage occurs from fertilizer runoff, which causes growth of algae that eventually destroys the local ecosystem and creates “dead zones”[xviii]. And although I recall reading that pesticides also inhibit the production of a natural enzyme in plants, which we incidentally need in small doses as well, I do not recall the source.

Another interesting tidbit from his first book is that the usage of pesticides occasionally decreases the total yield, a fact that he himself originally thought was experimental error. On further examination, he found that the pests attacking the rice actually thinned out the plants, leaving more room for the rest to grow better, as sunlight could reach the lower leaves. He states that results showing lower yields are simply assumed to be experimental error and discarded.

So it turns out that a discovery that earned a Nobel Prize also had disastrous effects on the environment, while an extremely scientific, though complicated, method gets largely ignored. One would also think that widely introducing a technology without sufficient testing of the consequences on a smaller scale is probably not a good idea. Perhaps historically, we were more inclined to think that technological innovations were inherently positive. Right?

Wireless Communications

Cell phones, otherwise known as the greatest mass experiment conducted by humanity, are so new and novel that we still don’t know many of their side effects. One possible side effect has been the reduction in the bee population. Data from the US Department of Agriculture show a 32 percent fall in beehives in 2007, 36 percent in 2008, and a 29 percent drop in 2009. There are many theories as to why the bees are dying, including, but not limited to, varroa mites, parasites, GMO crops, and cell phones. “Recently, a sharp decline has also been noticed in commercial bee population in Kerala posing a serious threat to honey bees…Similar cases have been observed in Bihar, Punjab, Nepal and other parts of India and have been attributed to increasing electro pollution in the environment,” was a short remark in a 50-page report detailing the effects of cell tower radiation[xix]. Some recent studies indicate that mobile devices disrupt the communication and navigation of bees, causing them to get lost[xx]. One should not underestimate the importance of pollinators in the production of food, without which, according to a remark commonly (and probably apocryphally) attributed to Einstein, we would have only around four years to live; at least that is short enough that we wouldn’t have to deal with an epidemic of brain cancer.
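For a sense of scale, those yearly losses compound. Treating the USDA percentages as successive declines of the same population (an illustrative simplification, since real hives are split and replaced), well over half of the starting beehive population would be gone after three such years:

```python
# Compound the three yearly beehive declines quoted above.
losses = [0.32, 0.36, 0.29]  # 2007, 2008, 2009

surviving = 1.0
for loss in losses:
    surviving *= 1 - loss  # fraction remaining after each year

print(f"Remaining after three years: {surviving:.1%}")
print(f"Cumulative decline:          {1 - surviving:.1%}")
```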

Which brings me to my second point. If I had asked you 20 years ago whether you’d like to put a 1-watt radiative antenna next to your head, would you have said, “Well, the potassium in the cerebrospinal fluid surrounding the brain would attenuate those microwaves pretty quickly, so I’m pretty sure I’m not cooking my brain; that sounds swell”, or would you just have said no? The WHO recently classified mobile phone radiation as “possibly carcinogenic to humans”. Considering that it would take 20-30 years to find out whether it _definitively_ causes cancer, it doesn’t make much sense to hold one against your ear or near your genital regions for hours at a time. [xxi] [xxii]

After that whirlwind tour of various subjects, one can see that conclusive scientific results are very difficult to obtain, and given the rate of technological progress, we have no way to ascertain what is safe or dangerous. We should have some skepticism towards the results of science, rather than treating them in almost religious terms. New technologies are not inherently good. Studies are sometimes incorrect, badly done, or based on too little data, and as such, we certainly shouldn’t treat anything out of the mouth of a scientist as gospel. We should always ask ourselves, “Does what this person is telling me actually make sense?”, along with questions like, “Why am I eating diseased livestock?”, “Why am I putting a one-watt antenna next to my head?”, and “What other effects does this drug/technology have that they haven’t investigated yet?”

It has been aptly said that half of what we know is right; it’s just that we don’t know which half. Until we know which half is which, we should have some common sense and err on the side of caution.

Unfortunately, this post barely covers examples of insensible science that are benign in nature. We still need to cover actual malicious and deceptive science, ie. “Who is making money from this?”

  1. []
  2. []
  3. []
  4. []
  5. []
  6. comic about diseased livestock []
  7. []
  8. need to find out proper percentage of able-bodied men that were taken out of the labor force []
  9. Other examples include, but are not limited to: A/B milk strains in New Zealand, the effect of electrical grounding on cows’ milk production, GMO crops, feeding cows corn and then irradiating the meat to get rid of acid-resistant E. coli, etc. []
  10. []
  11. []
  12. []
  13. []
  14. check with Malar []
  15. []
  16. []
  17. One Straw Revolution []
  18. []
  19. Report on cell tower radiation, Girish Kumar, 2010, IIT Bombay []
  20. Sharma, V.P. and Kumar, N.K. 2010. Changes in honeybee behaviour and biology under the influence of cellphone radiations. Current Science 98 (10): 1376–1378; Mobile Phones and vanishing bees, M.W. Ho, Science in Society, 2007 []
  21. Mobile phones and malignant melanoma of the eye. British Journal of Cancer (2002) 86, 348–349. doi:10.1038/sj.bjc.6600068. Published online 1 February 2002 []
  22. Salford, L.G., Brun, A.E., Eberhardt, J.L., Malmgren, L., and Persson, B.R.R. Nerve cell damage in mammalian brain after exposure to microwaves from GSM mobile phones []