The place was Edinburgh, Scotland. The occasion, the Edinburgh Science Festival. There were a number of captivating presentations, but my biggest thrill came from looking out the hotel window. A light rail track was being constructed just outside and the workers were busy welding. My eyes popped when I saw what they were doing. I was looking at a live thermite reaction! I had talked about this reaction in class on numerous occasions and marvelled at it in videos, but had always deemed it too dangerous to perform.
A chemical reaction that produces heat is said to be “exothermic.” The most common example would be the combustion of a fuel. Light a candle and you can feel the heat that is produced. The hottest part of a flame, where the colour is a light blue, can reach a temperature of about 1400 degrees Celsius. But that is a low temperature compared to the 2500 degrees produced by the “thermite” reaction between aluminum and iron oxide. Essentially, this reaction involves the transfer of oxygen from the iron oxide to aluminum to yield aluminum oxide and metallic iron. At this high temperature, the iron is in its molten form and sets fire to any combustible material in its path, making the thermite reaction ideal for use not only in welding, but also in incendiary bombs and grenades.
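The heat released can be estimated from standard enthalpies of formation. A minimal Python sketch, using approximate textbook values (about −824 kJ/mol for iron(III) oxide and −1676 kJ/mol for aluminum oxide; exact figures vary slightly by source):

```python
# Thermite reaction: Fe2O3 + 2 Al -> Al2O3 + 2 Fe
# Standard enthalpies of formation in kJ/mol (approximate textbook values).
dHf_Fe2O3 = -824.2
dHf_Al2O3 = -1675.7
# Elements in their standard states (Al, Fe) have an enthalpy of formation of zero,
# so the reaction enthalpy is just the difference between the two oxides.
dH_reaction = dHf_Al2O3 - dHf_Fe2O3  # kJ per mole of Fe2O3 consumed

print(f"Heat released: about {-dH_reaction:.0f} kJ per mole of iron oxide")
```

Roughly 850 kJ per mole of iron oxide, concentrated in a small mass of products, is what drives the temperature high enough to leave the iron molten.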
Back in 1893, German chemist Hans Goldschmidt was looking for a way to produce pure metals from their ores. The classic method for extracting iron relies on heating iron oxide ore with carbon. The carbon is converted to carbon dioxide as it strips oxygen from the iron, leaving behind metallic iron. Some unreacted carbon, however, tends to contaminate the iron. Goldschmidt was looking for a way to produce iron without the use of carbon and hit upon the reaction of iron oxide with aluminum. He was impressed by the remarkable amount of heat produced and suggested that the reaction he had discovered could be used for welding. In 1899, the thermite reaction was put to a commercial use for the first time, welding tram tracks in the city of Essen.
It didn’t take long for the military to realize the potential of this extreme exothermic reaction in warfare. In 1915, the Germans terrorized England by using Zeppelins to drop incendiary bombs based on the thermite reaction. By the Second World War, the battle was on not only between Allied and German armed forces, but also between their scientists and engineers who sought to produce more effective incendiary devices. The Germans came up with the “Elektron” bomb, named after Elektron, an alloy composed of 86 per cent magnesium, 13 per cent aluminum and 1 per cent copper that was used for the casing of the bomb.
This alloy burns with a very hot flame, but requires a high temperature for ignition. The thermite reaction was up to the task. When an Elektron bomb hit the ground, a small percussion charge of gunpowder ignited a priming mixture of finely powdered magnesium and barium peroxide. This reaction produced the heat needed to ignite the thermite mix of aluminum and iron oxide, which in turn ignited the highly combustible casing. The Allies developed similar types of bombs resulting in the most destructive air raid in history, which was not Hiroshima or Nagasaki, but the firebomb raid on Tokyo in March 1945. An Allied bombing of Dresden the same year with incendiary bombs virtually destroyed the whole city. During the Second World War, the Allies dropped some 30 million 4-pound thermite bombs on Germany and another 10 million on Japan.
Thermite hand grenades were also used during the war to disable artillery pieces without the need for an explosive charge, very useful when silence was necessary to an operation. This involved inserting a thermite grenade into the breech of a weapon and then quickly closing it. The great heat produced by the thermite reaction welded the breech shut and made loading the weapon impossible. Alternatively, a thermite grenade was discharged inside the barrel of an artillery piece making it useless.
During the Vietnam war, thermite grenades found a different use. From the start of hostilities, putting a crimp into the enemy’s food supply was part of the U.S. military strategy. Since rice was a staple for the Viet Cong, destroying rice paddies was a primary goal. At first, attempts were made to blow up rice stocks and destroy paddies with hand grenades and mortars, but this proved to be maddeningly difficult. The next idea was to burn the rice paddies with thermite grenades. All this did was scatter the rice grains, which could then still be harvested. Another approach was needed.
Enter “Agent Blue,” an arsenic-based herbicide, unrelated chemically to the more infamous Agent Orange. Agent Blue affects plants by causing them to dry out, and as rice is highly dependent on water, spraying Agent Blue on rice paddies can destroy an entire field and leave it unsuitable for further planting. The U.S. used some 20 million gallons of Agent Blue during the Vietnam war, destroying thousands of acres of agricultural fields and defoliating wooded areas that the Viet Cong used to ambush American troops.
Recently, the thermite reaction made the news in a different context. Conspiracy theorists purport that it was thermite explosives planted inside the World Trade Center that brought down the twin towers in a CIA-coordinated plot. They also maintain that the moon landing was faked and that the U.S. government is hiding the bodies of aliens. Some also claim that the rise of Donald Trump was engineered by a Democratic conspiracy and that on the verge of being elected he will announce “fooled you.” Wouldn’t that be something? It would trump the thermite reaction for heat generated.
During a recent talk on the relation between the body and the mind, I mentioned the newest anxiety-relieving craze, colouring books. Aimed at adults, these feature intricate patterns that provide quite a challenge for staying inside the lines. The contention is that focusing on the special patterns distracts the mind from anxiety and stress. Evidence is sketchy, but millions of colouring books are flying off the shelves, topping best-seller lists. That in itself says something about our society.
After my talk I was approached by a lady who claimed she had something better than colouring books to relieve anxiety and slipped a vial full of pills into my hand. She didn’t seem like a clandestine drug pusher so I thought I would look down and find some pills of lorazepam or maybe St. John’s Wort. Such was not the case. The label on the vial read “Arsenicum album 30C.”
No, she was not trying to poison me. These were homeopathic arsenic pills based on the curious notion that a substance that in large doses causes certain symptoms can, in homeopathic potency, relieve the same symptoms. Since arsenic poisoning is associated with anxiety and restlessness, a person suffering such symptoms should find relief in a homeopathic dose of arsenic. In the bizarre world of homeopathy, potency increases with greater dilution, and a dose of 30C is said to be extremely potent. Such a pill is made by sequentially diluting a solution of arsenic a hundred fold thirty times and then impregnating a sugar pill with a drop of the final solution. At a dilution of 30C, not only is there no trace of arsenic left, there isn’t even a water molecule that has ever encountered any of the original arsenic.
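The arithmetic behind that claim is easy to check. A quick sketch, assuming for generosity that the starting solution contains a full mole of arsenic compound:

```python
# Each "C" step is a 1:100 dilution; 30C means thirty such steps in a row.
avogadro = 6.022e23        # molecules in one mole of starting material
dilution_per_step = 1e-2   # one part in a hundred
steps = 30

remaining_fraction = dilution_per_step ** steps      # 10^-60
expected_molecules = avogadro * remaining_fraction   # ~6 x 10^-37

print(expected_molecules)  # astronomically far below a single molecule
```

Even starting with Avogadro’s number of molecules, the expected count left after thirty 1:100 dilutions is about 10^-37, which is to say: none at all.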
Homeopathy is a scientifically bankrupt practice that was invented over two hundred years ago by German physician Samuel Hahnemann who was disenchanted with bloodletting and purging, common medical procedures at the time. He was a good man who searched for kinder and gentler treatments and homeopathy fit that rubric. Since knowledge of molecules was almost non-existent at the time, Hahnemann could not have realized that his diluted solutions contained nothing. Actually, the truth is that they did contain something. A hefty dose of placebo!
Now here is the kicker to this story. Hahnemann was quite accomplished in chemistry and actually developed the first chemical test for arsenic. In 1787 he found that arsenic in an unknown sample was converted to an insoluble yellow precipitate of arsenic trisulfide on treatment with hydrogen sulfide gas. When in 1832 John Bodle in England was accused of poisoning his grandfather by putting arsenic in his coffee, John Marsh, a chemist at the Royal Arsenal, was asked to test a sample of the coffee. While he was able to detect arsenic in the coffee using Hahnemann’s test, the experiment could not be reproduced to the satisfaction of the jury and Bodle was acquitted. Knowing that he could not be tried for the same crime again, he later admitted to killing his grandfather.
The confession infuriated Marsh and motivated him to develop a better test for arsenic. By 1836 he had discovered that treating a sample of body fluid or tissue with zinc and an acid converted any arsenic to arsine gas, AsH3, which could then be passed through a flame to yield metallic arsenic and water. The arsenic would then form a silvery-black deposit on a cold ceramic bowl held in the jet of the flame and the amount of arsenic in the original sample could be determined by comparing the intensity of the deposit with that produced with known amounts of arsenic.
The Marsh test received a great deal of publicity in 1840 when Marie LaFarge in France was accused of murdering her husband by putting arsenic into his food. Marie was known to have bought arsenic from a local chemist, which she claimed was to kill rats that had infested the house. A maid swore that she had seen her mistress pour a white powder into her husband’s drink, and Marie had also sent a cake to her husband who was travelling on business just prior to his becoming ill. The dead husband’s family suspected that Marie had poisoned him and somehow got hold of remnants of food to which she had supposedly added arsenic. The Marsh test revealed the presence of arsenic in the food and in a sample of egg nog, but when the victim’s body was exhumed the investigating chemist was unable to detect arsenic.
To help prove Marie’s innocence by corroborating the results of the investigation of the exhumed body, the defense enlisted Mathieu Orfila, a chemist acknowledged to be an authority on the Marsh test. Much to the defense’s chagrin, Orfila showed that the test had been carried out incorrectly and used the Marsh test to conclusively prove the presence of arsenic in LaFarge’s exhumed body. Marie was found guilty and sentenced to life in prison. The controversial case captured the imagination of the public and was closely followed through newspaper accounts, making Marie LaFarge into a celebrity. It would also go down in the annals of history as the first case in which a conviction was secured based on direct forensic toxicological evidence. Because of Mathieu Orfila’s role in the case, he is often deemed to be the “founder of the science of toxicology.” The Marsh test became the subject of everyday conversations and even became a popular demonstration at fairgrounds and in public lectures. This had an interesting spin-off. Poisonings by arsenic decreased significantly since the existence of a proven, reliable test served as a deterrent.
As far as claims about relieving anxiety with homeopathic arsenic go, well, they cause me anxiety. I think I’ll flush those homeopathic tablets down the drain (no worry about arsenic pollution here) and buy a colouring book.
“We've had more people reverse cancer than any institute in the history of health care, so when McGill fails, or Toronto hospitals fail, they come to us. It can be stage 4 cancer and we reverse it.” You can imagine why that quote caught my eye. Both McGill and University of Toronto have world-class cancer treatment centers, but unfortunately, when it comes to stage 4 cancers, which are the most deadly, the chance of successful treatment is low. So, who is it that claims success where the latest evidence-based treatments fail? “Dr.” Brian Clement, who runs the Hippocrates Health Institute in Florida, apparently has the answers that have evaded mainstream researchers. What sort of doctor is this fellow? One who has some sort of accreditation as a “nutritionist” from a diploma mill where they apparently teach some, let us say, “interesting” science. I’m judging by the following rather fascinating outpouring of nonsense-bedecked drivel from the Hippocrates Health Institute.
“Based on modern biophysics and ancient Chinese medicine, color frequencies are applied to acupuncture points using a light pen and crystal rods. This promotes hormonal balance, detoxification, lymph flow and immune support while reducing headaches and sleeplessness. Working on cellular memory where the cause of disease resides, color puncture promotes healing from within.” And all you have to do is shell out $120 for a 50 minute treatment. All this of course is laughable, but when it comes to claims about curing cancer, the humour quickly vanishes with the realization that it is real people with real cancer who are being duped. And going by the following asinine promo, that is just what is happening.
“One of the major treatment goals of The Cancer Wellness Program at Hippocrates Health Institute is to strengthen the basic vitality, flow, and coherency of a person’s BioEnergy Field upstream to affect and change their downstream physical mass. The changes in a person’s vibrational frequency or bioenergy field, once stabilized, changes the electrical/chemical milieu in their body so that it is more difficult for their cancer or tumor mass with its own specific vibrational frequency to be sustained.”
This inane claptrap is far from the only type of cancer treatment Hippocrates offers. Intravenous vitamins and wheat grass implants are standard fare. Implanted where? Well, let’s just say in areas where the sun doesn’t shine. Clement maintains that “every disease known to man, plus premature aging, can be successfully dealt with on a diet of organic plant based foods.” Apparently not mental disease, given that Clement surely follows this diet. Patients are also told to give up meat and dairy, and are asked to swallow some rather bizarre ideas. Genetics don’t matter much, Clement says, and what doctors say about the BRCA gene predisposing to breast cancer is false. On his regimen, this mental wizard claims, tens of thousands of people have reversed the final stages of cancer. I would love to see the evidence for that. This charlatan is in Canada right now, giving talks, mostly to entice First Nations people to visit his Institute in Florida for treatment. Just like that given to the unfortunate 11-year-old Ontario girl who suffered from leukemia. That had a very sad outcome. Let’s just say she was not one of the tens of thousands of patients that Clement claims to have successfully treated.
Sometimes you can evaluate a person’s scientific acumen by a single comment they utter. This is the case with Catherine Sugrue, who labels herself a “holistic nutritionist rockstar.” Of course suspicion about her knowledge is immediately raised when we learn that it was gained at the “Institute of Holistic Nutrition,” which isn’t exactly Harvard. But the giveaway of the rockstar’s untrustworthiness is her reiteration of the absurd statement that “margarine is about one molecule away from plastic.” This isn’t about coming to the rescue of margarine. I don’t like it and I don’t eat it. I much prefer butter. But I am piqued by the shoddy pseudoscientific exhortations of self-proclaimed experts. In this case I’m further annoyed that this particular pseudoexpert was interviewed for an article about fats that appeared not in the National Enquirer, but in the National Post, when there are Canadians like Yoni Freedhoff, Chris Labos and Tim Caulfield who actually are experts when it comes to nutritional issues and would never confuse the public with ludicrous analogies between margarine and plastic.
Margarine being “one molecule away from plastic” is just plain nonsense. Plastics are composed of polymers while margarine is a blend of fats and water. There is no chemical similarity between the two. In any case, being “one molecule away” is a totally meaningless expression. Substances are made of molecules, which in turn are composed of atoms joined together in a specific pattern. I suppose one might say that hydrogen peroxide, H2O2, is one atom away from water, H2O, but even this is meaningless. That extra oxygen atom changes the properties of the substance dramatically. Sticking a finger into a bottle of pure hydrogen peroxide quickly reveals the effect of that extra oxygen.
So, even if margarine had some chemical similarity to plastic, which it does not, its properties could still be dramatically different. Slight alterations in molecular structure can account for very significant changes in properties.
It is true that saturated fats have been vilified beyond the scientific evidence but the pendulum is swinging too far in the other direction. Kourtney Kardashian attributing her 5 pound weight loss to drinking clarified butter every morning is without scientific merit. Catherine Sugrue correctly warns that “getting your nutritional advice from celebrities is a dangerous game.” But so is getting it from a self-proclaimed “holistic nutritional rockstar” who is a graduate of an institution where you can take continuing education courses in “energy medicine,” “clinical detoxification,” and “applied iridology.”
We have become familiar with the routine at airports. Your carry-on bags are passed through an x-ray machine after which an officer will often wipe your bag with a piece of fabric which is then placed inside a box-like instrument. Within a few seconds you get the all-clear signal and you are on your way to the gate. How many travelers get handcuffs instead of an all-clear isn’t known because those stats are not released. What do these instruments actually do? When luggage is bombarded with x-rays, some of the rays pass through and some do not, depending on what they encounter. The more dense a material is, the less transparent it is to x-rays. Lead, for example, prevents any x-rays from passing through. To put it simply, the intensity of the x-rays that have passed through the luggage is a measure of the density of the substances contained in the luggage. Different substances will have unique densities and the densities of various explosives have been determined. The x-ray machine then compares the densities detected by the passage of x-rays to the predetermined densities of a host of suspect substances.
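The density dependence follows the Beer-Lambert attenuation law: the transmitted intensity falls off exponentially with the product of a material's mass attenuation coefficient, its density, and its thickness. A rough sketch, using illustrative attenuation values in the neighbourhood of 100 keV (real scanners use tabulated coefficients across many x-ray energies, so these numbers are only for the sake of comparison):

```python
import math

def transmitted_fraction(mu_mass, density, thickness_cm):
    """Beer-Lambert law: I/I0 = exp(-mu * rho * x),
    where mu is the mass attenuation coefficient (cm^2/g),
    rho the density (g/cm^3), and x the thickness (cm)."""
    return math.exp(-mu_mass * density * thickness_cm)

# Illustrative (mu, density) pairs near 100 keV -- assumed round numbers,
# not values from any particular reference table.
materials = {
    "water":    (0.17, 1.00),
    "aluminum": (0.17, 2.70),
    "lead":     (5.5, 11.34),
}

for name, (mu, rho) in materials.items():
    frac = transmitted_fraction(mu, rho, 1.0)  # 1 cm slab
    print(f"{name:8s} transmits {frac:.3g} of the beam")
```

Even with similar attenuation coefficients, the denser material blocks more of the beam, and a centimetre of lead transmits essentially nothing, which is why it shows up as an opaque block on the operator's screen.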
The instrument that analyzes the swabs is an “ion mobility spectrometer.” When the swab is inserted, a gust of a carrier gas dislodges some of the molecules that have been collected from the luggage. These molecules are then subjected to bombardment by electrons, commonly from a Nickel-63 isotope source. The bombardment creates ions that are swept through a tube where they are subjected to an electric field resulting in a separation by mass, size and shape of the molecules. These ions are detected and compared with those produced by known samples.
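The separation works because each ion drifts through the gas at a velocity proportional to the applied field, with a mobility constant set by its mass, size and shape; the arrival time at the detector is therefore a fingerprint. A back-of-the-envelope sketch with typical orders of magnitude (the mobility, field and tube length below are assumed round values, not the specs of any actual instrument):

```python
# Ion mobility spectrometry, simplified:
# drift velocity v = K * E, so drift time t = L / (K * E).
K = 1.5      # ion mobility in cm^2 / (V * s) -- typical magnitude for small ions
E = 250.0    # electric field along the drift tube, V/cm
L = 10.0     # drift tube length, cm

v = K * E                # drift velocity, cm/s
t_ms = (L / v) * 1000.0  # drift time in milliseconds

print(f"Drift time: {t_ms:.1f} ms")
```

Drift times on the order of tens of milliseconds are why the whole swab analysis takes only seconds: the instrument can average many drift cycles and still report almost instantly.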
The technology is extremely sensitive and can detect trace amounts of explosives. It is not dependent on having nitrogen in the sample, an element found in almost all explosives except in triacetone triperoxide (TATP). This is what was used in the Belgian and London bombings. TATP is often the choice of terrorists because it is easy to manufacture from acetone, hydrogen peroxide and an acid, all of which are readily available. Of course an explosive in luggage can only be detected if the luggage is inspected. But in the case of the Belgian airport bombing, the explosive was set off in the pre-screening area. To try to counter this, hand held detectors have been developed for use by officers who patrol all areas. When gaseous TATP molecules enter this sensor, they encounter an acid catalyst that converts TATP back into its constituent parts, acetone and hydrogen peroxide. The peroxide then reacts with dyes in the instrument causing them to change colour. By detecting these colour changes, the highly sensitive portable scanner can detect fewer than two parts per billion of TATP. But unfortunately no matter how clever the detector chemistry, it can’t foil all terrorists’ attempts.
A sensational sounding e-mail about “exploding coffee” has been making the rounds. It describes the misadventures of an unfortunate soul who heated up water for coffee in a microwave oven. When he picked up the mug, it “exploded!”
Explode is probably too strong a term, but spurting and frothing is a real possibility. This is due to a phenomenon known as superheating. First, we have to understand what boiling is all about. At the surface of a liquid, molecules are always evaporating. If we leave a glass of water out, it will eventually disappear. If we heat the liquid, its molecules move faster, become more energetic and more molecules go into the vapour phase. As a consequence, the liquid disappears more quickly. At the boiling point, molecules all over the liquid, not only at the surface, are energetic enough to go into the vapour phase. They do this most readily by evaporating into airspaces that exist in the container. All containers have imperfections where air gets entrapped when a liquid is introduced. As these air pockets fill with vapour, they expand and begin to rise. That is why we see streams of bubbles which originate at the sides or the bottom of the container.
In a microwave oven, the container is not heated, only the water. So the container actually cools the liquid in contact with it, meaning that the liquid in the center is always hotter, sometimes by as much as 10 degrees C. But the liquid in the center cannot boil, because there are no air bubbles for it to evaporate into. By the time the liquid near the edge of the container reaches the boiling point, the liquid in the middle is considerably hotter; it is superheated.
The addition of sugar or a tea bag now can spur vigorous boiling. This is because the surface imperfections introduce trapped air bubbles into which the superheated liquid vaporizes. Sometimes just picking up the container can have an explosive effect as the superheated liquid comes into contact with air bubbles on the periphery. Accidents can be prevented by putting a plastic spoon into the mug or glass while it is heating in the microwave. In this case the scare-mongering note about “exploding coffee” may actually have some basis in fact.
What makes people defend the indefensible? A prime example of this conundrum is the case of Antonella Carpenter, a 71-year-old “alternative practitioner” who was convicted in Tulsa, Oklahoma of conducting a fraudulent scheme to cure cancer and is likely looking at spending the rest of her life in prison. She is not a physician but has some training in physics and claims that she can cure cancer by injecting a tumour with a saline solution of food colouring and walnut hull extract followed by heating the area with a laser. She calls her treatment “Light Induced Enhanced Selective Hyperthermia,” for which she claims 100% efficacy without any side effects. Any claim of 100% efficacy is a hallmark of quackery since no drug of any kind works in such a foolproof fashion. Even worse, she sometimes told patients they had been cured. As is often the case, quacks unearth some legitimate process and then twist it out of proportion to hatch a money-making scheme.
In this case, the legitimate process is “photodynamic therapy.” The treatment of cancer involves some process by which cancer cells are destroyed while normal cells suffer less damage. Unfortunately, it isn’t possible to avoid collateral damage completely and cancer treatment via radiation or drugs is always burdened with side effects. In photodynamic therapy the idea is to introduce a chemical, known as a photosensitizer, that when activated by light interacts with oxygen to convert it into a very reactive form known as “singlet oxygen” that then attacks any organic compound it encounters with the result being cell death. The photosensitizer can be introduced intravenously followed by the tumour being exposed to long wavelength light via optical fiber. Alternately, the photosensitizer can be injected directly into a tumour and then the area exposed to light. In either case singlet oxygen is produced only within the tumour, minimizing damage to normal tissue. The process is applicable to certain types of tumours and is certainly not a cure-all for cancer.
It is this therapy that has been mentally mangled by Antonella Carpenter, who according to investigators cheated cancer patients out of their money and gave them false hope. Despite the absence of any evidence that her treatment has any efficacy, supporters have sprung to her side, claiming that she was wrongly convicted by a kangaroo court. Here are some of the phrases they are pumping out: “The greedy and vindictive genocidal maggots who control the Cancer Industry and have the FDA and courts in their back pocket”…. “the medical mafia is hard at work twisting the truth and vilifying Dr. Antonella Carpenter and any other non-Allopathic practitioners and natural or alternative treatments as quackery”…. “Dr. Carpenter was vindictively targeted by the Medical Mafia and their Gestapo goons at the FDA for successfully curing dozens of cancer patients.” No. The truth is that she was targeted for subjecting cancer patients to a treatment that had no chance of working and was claiming she had cured them. That is evil.
Picture this. You swallow a little pill, wait until it irritates your intestines enough to expel its contents and then hunt through the expelled excrement to retrieve the pill. Why? So you can use it next time to get rid of the bad humours in your body that are making you sick. How can a pill survive passage through the digestive tract? It can, if it is made of metal, in this case, antimony. Now, don’t go asking the pharmacist for antimony pills. The scenario just described isn’t current, it was plucked out of the Middle Ages when the cure for disease was to expel “bad humours” from the body. Actually, that was not unlike the current craze of expelling unnamed toxins from the body with a variety of “cleanses,” many of which have a laxative effect.
Hopefully nobody today would be silly enough to use antimony or its compounds, because here we are talking about real toxicity. Of course they didn’t realize that in the Middle Ages; all they knew was that antimony was pretty good at evacuating the body. And not only through the rear portals. One method involved drinking wine that had been left standing overnight in a cup made of antimony. This resulted in the antimony reacting with tartaric acid in the wine to form antimony tartrate, a compound that induces vomiting. The idea of purging the body to treat illness persisted into the late stages of the 18th century. When Mozart came down with a mysterious illness, he was treated with “tartar emetic,” as antimony tartrate was commonly called. What ailment he suffered from isn’t clear, but he died within two weeks. His symptoms of intense vomiting, fever, swollen abdomen and swollen limbs are consistent with antimony poisoning. Of course, we cannot prove that antimony was responsible for Mozart’s death; he also suffered from rheumatic fever since childhood, a condition that may have led to his demise at a young age.
Mozart had always been sickly and it is well known that he had been often treated with antimony compounds by his physicians and that he even dosed himself when he didn’t feel well. It is interesting that Mozart actually believed he was being poisoned, but not by himself. He thought his musical rival Antonio Salieri was trying to do him in. Although the famous movie “Amadeus” alludes to this possibility, historical facts do not corroborate the poisoning story. Contrary to the portrayal, Salieri did not confess at the end of his life to having tried to kill Mozart.
Back in the 1990s a volatile compound of antimony known as stibine (SbH3) was accused of being responsible for crib death. The theory was that it was produced from antimony oxide added as a flame retardant to polyvinylchloride sheets. A fungus found in mattresses supposedly made this conversion possible, at least under laboratory conditions. The theory has now been dismissed because neither the fungus, nor levels of antimony in babies’ blood could be correlated with crib death.
More recently Greenpeace created a stir with a booklet entitled “A Little Story About The Monsters In Your Closet.” What sort of “monsters?” The subtitle brings them out of the closet: “Study finds hazardous chemicals in children’s clothing.” Yup, the monsters are chemicals. One that the Greenpeace study detected was antimony trioxide, present in all fabrics that have polyester as a component. No great surprise here since antimony trioxide is used as a catalyst in the production of polyester as well as a flame retardant. And it is true that antimony trioxide can be described as presenting a hazard. But hazard is not the same as risk.
Hazard is the innate potential of a substance to cause harm without taking into account extent or type of exposure. Inhalation of antimony compounds in an occupational setting can be a problem, and it is correct that antimony trioxide has been classified as “suspected of causing cancer via inhalation.” But this is not relevant for the trace amounts found in fabrics. Here the issue would be migration out of the fabric and subsequent absorption. This has been extensively investigated and the amounts that are encountered are well below the established migration limits. The same applies to the trace amounts that leach out of the polyester bottles that are widely used for water and other beverages. Concentrations are less than the 5 parts per billion safety limit.
Antimony does not occur in nature in its metallic form, so where did Middle Age physicians get it? Like most metals, antimony has to be smelted from its ore, in this case antimony sulfide, also known as stibnite, a substance that has been known for thousands of years. Jezebel, the Biblical temptress is said to have used it to darken her eyebrows and stibnite was the main ingredient in “kohl” used by ancient Egyptian women in a type of mascara. Exactly who figured out that heating antimony sulfide converts it to antimony oxide, which yields metallic antimony when fired with carbon, is unknown, but if you visit the Louvre, you can see a 5000 year old vase that is made of almost pure antimony.
Today, neither metallic antimony nor its compounds have a medical use, although up to the 1970s, antimony compounds were used to treat parasitic infections like schistosomiasis. These preparations did kill the parasites, but sometimes they also dispatched the patient. Up to the early twentieth century, tartar emetic was used as a remedy, albeit an ineffective one, for alcohol abuse. The New England Journal of Medicine once reported a case of a man whose wife tried to cure him of his alcoholic habit by secretly putting tartar emetic into his orange juice. The result was a trip to the hospital with chest pains and liver toxicity. Two years later the man reported complete abstinence from alcohol. Seems antimony had taught him a lesson.
Recently, there’s been an influx of media attention on guts. More specifically, the microbes that live in your gut. Extensive research is being done on these little guys as they seem to be having a real impact on our health. These gut microbes may be minuscule but their function is major. And I learnt all about them at “The Secret World Inside You” exhibit now on display at the American Museum of Natural History in New York.
Before I begin walking you through the exhibit, first a brief explanation as to what microbes even are. Microbes are living organisms so tiny that they can only be seen with the help of a microscope. And they are everywhere – in every fold and lining of our bodies, including our insides. They literally govern the world inside us and are responsible for much of how we function.
Our skin is the first point of contact for microbes, which is probably why it’s the first section you get to in this exhibit. No two individuals have exactly the same microbiome. What came as a real shocker, though, was that people living together – families, roommates, and yup, pets too – share certain microbial make-up. So much so that when one person leaves the nest for a few days, the microbiome of the house shifts until they return home again. Pretty sweet, no? Everyone sharing the same types of microbes… (It could also be slightly gross if you think about it too much, so just don’t.) It was also pointed out how certain microbes, as distant as they may seem, are actually closely linked. Take cheese, for example. The holes in Swiss cheese are made by a bacterium similar to one that lives on the skin, which is why (some) feet take on a cheesy smell. On feet, the bacterium Brevibacterium linens breaks down amino acids in sweat into smelly compounds, but in the world of dairy it serves to ripen Limburger cheese. Delicious? Depends.
Now perhaps it’s my age and the fact that my ovaries twitch on a regular basis thanks to all the babies on my FB feed, but the next section of the exhibit was hands down my favourite: “Before Birth”, the world of the baby and the microbiome of the mama. One would think that the two are inextricably linked, since the fetus is totally reliant upon the mother; however, to my surprise, the mother’s microbiome does not mix with the fetus at all. In fact, if it did, the results could be very risky. It’s thanks to the placenta, the gatekeeper in this whole process, that the two don’t mix. After visiting this exhibit I developed a newfound respect for the placenta, since it serves a pivotal function, allowing nutrients and oxygen to enter the amniotic sac while preventing other materials from doing so.
Once a woman’s water breaks, all bets are off. The baby is now cooked enough not only to mingle with its mother’s microbes but to start developing a microbiome of its own. And the birth canal is where this all happens. As the baby travels through the canal, the mother’s microbes get pressed into its skin, nose and eyes, and are even swallowed by the little one, eventually reaching the baby’s gut where they can start its own gut microbiome. This process is crucial to the development of a baby’s healthy immune and digestive systems. (How awesome!) But you may be wondering, as was I, about C-section deliveries, since these babies do not pass through the birth canal picking up the mother’s microbes along the way. Instead, they pick up microbes from the doctor’s hands and the environment. These end up lining the baby’s digestive tract and in turn affect the immune system, putting C-section babies at higher risk of a variety of conditions, such as asthma and allergies. To test this, studies are now being done in which, immediately after a C-section delivery, the baby is swabbed with a gauze pad that soaked up the microbes in the mother’s birth canal right before birth. Time will tell whether this benefits the baby, but most signs point to yes, which is good news since about one mother in three in the United States now gives birth by C-section.
As life goes on, microbes live, grow and multiply based on what we feed them. Meaning, the food we eat and the choices we make influence our gut bacteria. This has spawned a huge new area of research looking at individual variation when it comes to weight gain and loss, which was another section of the exhibit that I found fascinating, since like the majority of people on the planet I have a few pounds that just won’t relent.
Different people react to different foods in different ways. This is not a novel idea. I mean, just look at allergies and adverse food reactions. Some people have them, some people don’t. But what if this can be attributed to the type of microbes living in your gut? Let’s take a “healthy” food like a tomato, for example. Could you imagine if someone’s blood sugar spiked after eating tomatoes the same way it would after eating a donut? Research has shown that this is indeed the case for some individuals, while in others tomatoes have zero spike effect. This whole new line of research could be a breakthrough in terms of weight control. Costly, but important. I know I’d be among the first to sign up to find out just what type of bacteria I have going on in my gut. Of course, as the exhibit suggests, one cannot know whether obese people are obese because of their microbiome, or whether external factors caused their microbiome to be as it is in the first place. It’s a chicken-or-egg debate, and we shall leave it to science to continue the research.
After leaving the exhibit, I realized that the microbiome is truly a hotbed of scientific research. We know so much but at the same time there are so many question marks about how we can use, manipulate, and alter our microbiome to enhance our health. And I am confident that science will, at one point or another, provide us with these answers; but until then, I’m just going to hope that my gut bacteria interact favourably with tomatoes.
You can visit “The Secret World Inside You” at the American Museum of Natural History in New York, where it will be on display until August 2016.
Laboratory workers had long been plagued by sooty, hard-to-control flames. Bunsen, of course, knew that oxygen was necessary for combustion and that soot was the product of incomplete combustion. He therefore concluded that the secret to a clean flame lay in mixing the combustible gas with air in just the right proportion.
The prototype Bunsen burner consisted of a metal tube with strategically drilled holes through which air could enter and mix with the combustible gas flowing through the tube. A sliding metal cover allowed the operator to vary the number of open holes and thus control the character of the flame. Bunsen, however, never patented his invention. He did not believe that scientists should profit financially from their work; research was to be done for its own sake. Why was Bunsen so interested in developing a clean flame? Because he had a passion for studying the diverse brilliant colors produced by sprinkling various substances into a fire. He had noted that throwing sodium chloride (ordinary salt) into a flame always resulted in a bright orange-yellow glow. The same color appeared if sodium bromide, or indeed any compound of sodium was cast into the flame. Other elements also produced characteristic colors. In fact Bunsen discovered the existence of the elements rubidium and cesium through the colors they produced.
Over a hundred years earlier, Newton had shown how a prism can be used to separate white light into the colors of the rainbow. Bunsen now applied this principle to separate the colors of a flame into their individual components. The spectroscope, an instrument he developed together with the physicist Kirchhoff, allowed unknown substances to be identified purely by the colors they produced when heated in the flame of a Bunsen burner.
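The reason each element has its own flame color is that its atoms emit photons of characteristic energies. As a back-of-the-envelope illustration, here is a short calculation of the photon energy behind sodium's famous orange-yellow glow (the 589 nm "D-line" wavelength is a standard textbook value; the helper function itself is just for illustration):

```python
# Photon energy of sodium's characteristic orange-yellow flame emission
# (the "D line" at ~589 nm), computed from E = h * c / wavelength.

PLANCK_H = 6.626e-34          # Planck's constant, joule-seconds
LIGHT_C = 2.998e8             # speed of light, metres per second
JOULES_PER_EV = 1.602e-19     # one electron-volt in joules

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy in electron-volts of a photon with the given wavelength."""
    wavelength_m = wavelength_nm * 1e-9
    energy_joules = PLANCK_H * LIGHT_C / wavelength_m
    return energy_joules / JOULES_PER_EV

# Sodium D line: roughly 2.1 eV per photon
print(f"{photon_energy_ev(589.0):.2f} eV")
```

Shorter wavelengths carry more energy, which is why the blue emissions Bunsen saw from some elements correspond to higher-energy atomic transitions than sodium's yellow.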
So, who cares what colors are produced in a flame? Well, just think of the glorious colors of fireworks. Or the bright red strontium flame of an emergency roadside flare. Or the yellow glow of a sodium vapor highway light. The original studies that led to these applications were painstakingly carried out by Robert Bunsen. After having long toiled with flames and spectroscopes in the laboratory, the great man spent years writing up his work for publication. The day the manuscript was finished, he left it on his desk and went out to celebrate. When he returned, Bunsen was horrified to see a smoldering pile of ashes where his treasured treatise had been.
A flask filled with water had been next to the papers and had acted as a magnifying glass, focussing the sun's rays and igniting the manuscript. A lesser man would have surrendered to fate at this point. But Bunsen, even at an advanced age, doggedly repeated the work and eventually published the results of his spectroscopic research so that all the world finally became aware of his burner.
Last week, we heard in the news about a shocking admission from executives at the National Football League in the USA: they may finally acknowledge a link between football-related head trauma and chronic traumatic encephalopathy, or CTE, a degenerative brain disease well known to be caused by repeated trauma to the head, such as from concussions. The reason I find that headline shocking is not that there was an admission of the link, but that the NFL was able to deny its existence until now simply by choosing not to believe it was there.
A little closer to home, the news over the past few days has featured email conversations among the top brass at the NHL that show them debating the merits of fighting in hockey by exchanging their beliefs over whether repeated concussions may lead to mental illness, brain injury or addictions later in life. As if these corporate jocks have the medical knowledge required to understand this issue well enough to be qualified to decide upon it. When the medical experts have made pronouncements on this issue, they unambiguously say that getting punched in the head repeatedly over one’s career is very likely to cause brain damage of one kind or another.
How is it that scientific specialists, like the medical researchers in these previous examples, are so easily discounted as being irrelevant in the face of someone else’s belief in something else? This ability to whimsically discount science in favour of a more convenient belief is not restricted to sporting organizations. In fact, we well know that it can be observed at a larger scale, both geographically and destructively speaking, in the anti-scientific dismissal of climate change. How many times have you heard of an online poll or a talk-show pundit that asks whether or not we should believe in climate change? Unfortunately for those pollsters, brain injuries and climate change are real, whether you believe in them or not.
The thing is that our medical knowledge of brain injuries, like our vast knowledge of the earth and its climate, is not based on beliefs at all, but rather on thousands upon thousands of accumulated, mutually supporting facts that have withstood every attempt at falsification. These form the basis of the scientific method, which is simply a process of observing and describing the universe and explaining its properties and behaviour. Incidentally, it is also the most reliable and robust tool we have found to date for sorting fact from fiction in the natural world.
One of the most pervasive problems in society today is the mistaken equivalence of a specialist’s knowledge and understanding of facts with a non-specialist’s beliefs in something else. Belief and knowledge are incompatible with one another, and only one of the two is a reliable way of arriving at the truth. ‘So what?’, we may ask. Perhaps I will be seen as an academic fuddy-duddy arguing about semantics, and told that we should just live and let live, each with our own thoughts and beliefs. Well no, I say. This is a problem with real consequences for people, such as the ex-hockey enforcers dying from brain damage or the millions already being affected by runaway global warming.
On a very basic level, this issue often hinges on the flawed equivalence of beliefs and knowledge in society. The two terms are so dissimilar that they are more like opposites than synonyms. Acknowledging this alone would go a long way toward improving the health, safety and prosperity of society. As a starting point, it may be helpful to examine the meaning and usefulness of each of these two words as concepts.
Belief is the acceptance that something is true through the acts of trust, faith or confidence. There is no requirement for evidence in order to believe in something, making it a useful option for simplifying otherwise difficult concepts to absurdity or for dismissing inconvenient truths to irrelevance. Knowledge, on the other hand, is the theoretical or practical understanding of a subject. The key word in that sentence is ‘understanding’, because to truly know something, one must be able to describe what, how and why it is what it is... and this requires the use of facts as evidence.
Obviously, the only reliable path to understanding something is knowing it, and to know something is to understand it: a mutually supportive conceptual symbiosis. There is no place or need for belief in this context. In fact, beliefs are useless in generating knowledge because they offer no power to explain anything.
Historically, belief has been used as a means of explaining the unknown, supporting such baseless claims as that the earth is flat or that humans did not evolve but instead magically appeared through divine intervention. In many ways, belief continues to have significant influence today. Over time, however, our scientific advances have allowed us to replace most beliefs about things with knowledge of things, eventually making faith and belief unnecessary.
Furthermore, when knowledge is not available because of a gap in our understanding, there is no shame in admitting that we don’t know. It is certainly a more noble approach than inventing a belief-based explanation for which we have no supporting evidence. In fact, the ability to acknowledge what we do not know is arguably the most important component of having knowledge. Socrates famously pondered the nature of knowledge by stating that it may only come with the admission that one does not know what one does not know, or something along those lines.
Perhaps it was paraphrased more effectively by then US Secretary of Defense Donald Rumsfeld when justifying the attack on a country, despite the lack of evidence that it may actually have posed a threat, when he said that “there are known knowns, there are things we know we know; there are known unknowns, that is to say there are things we know that we do not know; but there are also unknown unknowns, the ones that we don’t know that we don’t know”.
I couldn’t agree more with Rummy! However, the solution to the conundrum of both the known unknowns and the unknown unknowns is not to invent a friendly fact that may likely be untrue (and often is). The right thing to do is to say that we don’t know but will try to find out. This is the only honest path to the truth and one that is built into the scientific method of knowledge building.
Whenever someone asks me what I believe about something or other, I always answer that I don’t believe anything at all; I either know something or I don’t. That applies to everyone else as well, believe it or not! Personally, I like to think that Yoda would have said it best: “Know or do not know, there is no belief”.
There is a lot of nonsense that goes around about microwaves, and I’m sure you’ve heard much of it. They destroy nutrients in food. They cause cancer if you stand next to a microwave oven. Microwaved water kills plants. All poppycock. And then there is the story about a woman who died because the blood she received in a transfusion had been warmed up in a microwave oven. The case of Norma Levitt is an interesting one and is often used by anti-microwave activists to prove that microwaves are dangerous. It proves nothing of the sort. Here are the facts.
Norma Levitt had successful hip surgery at the Hillcrest Medical Center in Tulsa in 1989, but unfortunately died on the operating table after the procedure. She received blood during the operation which had been warmed in a kitchen microwave oven. After her death, the family launched a lawsuit claiming negligence because the blood had been warmed in a non-standard fashion. The defendants, the doctors involved in the operation, asserted that the patient had died of a blood clot, a complication of surgery. The court found for the defendants, whereupon they launched a successful lawsuit against the plaintiff’s attorneys for wrongful accusation. Each defendant was awarded $12,500.
Whenever blood is used for a transfusion it is warmed to body temperature. Heaters especially designed for this process are available to guard against overheating, which can result in hemolysis, the destruction of red blood cells. This in turn causes the release of potassium from the cells, and excess potassium can be lethal. The issue is one of overheating the blood, not of the method used. Microwave ovens heat very quickly and temperature control is difficult, which is why they are not appropriate for warming blood. It has nothing to do with microwaves being “dangerous”!
The allegations on anti-microwave websites suggest that exposure to microwaves somehow produced a dangerous substance in the blood that killed Norma Levitt. This is nonsense. Overheating blood by any method produces the same result. No, blood should not be heated in a kitchen microwave before a transfusion, but this has absolutely no bearing on cooking with microwaves. This is a classic case of taking a smidgen of truth and blowing it out of proportion. And incidentally, the court did not find that the transfused blood was the cause of death.
Marketing these days is often based not on what is in a product, but rather on what it doesn’t contain. Labels scream no cholesterol, no trans fats, no gluten, no BPA, no phthalates, no parabens and the ultimate absurdity, no chemicals. Smirnoff vodka, however, is going against this trend with the introduction of Smirnoff Gold that has a hint of cinnamon flavouring and floating flakes of pure gold. The thin slivers of gold stay dispersed through the beverage, give the product a luxurious image and draw attention to the bottle on the shelf. No health claim of any kind is made but Smirnoff promises that the cinnamon flavor is all the better due to the edible gold leaf. That is highly questionable since gold has no taste and is essentially insoluble in alcohol. It adds glitter but nothing else.
No worry about consuming the gold though, it is indeed edible and has been eaten since the days of the ancient Egyptians when it was thought to purify the body, mind and spirit. In Elizabethan England, the wealthy served meals decorated with gold leaf, and in Italy desserts decorated with gold were supposed to ward off heart disease. Alchemists searched for elixirs made of drinkable gold that would supposedly restore youth and rid the body of disease. The hope was that since gold was the eternal metal, not subject to aging in any fashion, it would transfer its anti-aging properties to whoever consumed it. They weren’t totally on the wrong track because in the twentieth century some compounds of gold were shown to have an effect on easing the pain of arthritis. Drinking Smirnoff Gold vodka may ease the pain of arthritis, but not because of the presence of any gold. Alcohol can distract from pain.
This is not the first alcoholic beverage to feature gold. Danziger Goldwasser, a German root and herbal liqueur that has been produced since at least 1598, features gold flakes, as does Goldschläger, a Swiss cinnamon schnapps. The name comes from the German for "gold beater," referring to the profession of beating bars of gold into micrometer-thin sheets. There is no truth to the rumour that the gold flakes are added to the beverage to make tiny cuts in the throat for quicker absorption of alcohol; the flakes are far too thin to have any such effect. So you don’t have to worry about eating the world’s most expensive pizza at Margo’s Pizzeria in Malta, where a pie decorated with gold goes for about $400 U.S.
But if you really want to go on a spending spree, seek out the 666 Burger food truck in New York where for $666 you can purchase a foie gras-stuffed Kobe patty covered in Gruyere cheese that’s been melted with champagne steam and topped with lobster, truffles, caviar, and a BBQ sauce made with Kopi Luwak coffee beans that have been pooped out by the Asian palm civet. The whole thing is then served in a gold-leaf wrapper. Basically, it is a sarcastic comment on the super-expensive burgers available in some restaurants. So far people are not lining up for the golden burger, but the food truck sure got some golden publicity.
“The customer is always right” is a time-honoured adage in marketing. It holds true even if the customer is wrong. If the customer does not want “artificial” preservatives in food, industry will comply, whether that move is supported by science or not. Of course no company wants to poison its customers, so eliminating preservatives is a risky business. What’s the answer? Look for a “natural” preservative. That will satisfy the consumer who has a disdain for anything artificial, and at the same time reduce the producer’s worry about marketing an unsafe product.
Kraft, for example, has announced that, at least in the U.S., it will be replacing artificial preservatives with natural ones in its cheese products. This boils down to not much more than a question of semantics. Sorbic acid and its salts, the “artificial” preservatives that have been used, are to be replaced by natamycin, an antifungal compound produced by soil bacteria. Although many cheeses are actually mould ripened, with blue cheese being the classic example, cheese is also prone to infection by a variety of rogue moulds that can cause spoilage. Sorbic acid and its salts can prevent the growth of moulds, yeast and fungi, even when used at concentrations of less than 0.1%. It was back in 1859 that Professor August Wilhelm Hofmann first isolated sorbic acid by distilling the oil obtained from the berries of the rowan tree. This is the same Professor Hofmann who was enticed to England by Prince Albert to head up the newly created Royal College of Chemistry and who essentially founded the synthetic dye industry.
So, doesn’t the fact that sorbic acid can be isolated from berries make it a “natural” substance? Yes. And I suppose there would be no clamoring to remove it from food if that were how it was produced. But distilling sorbic acid from rowan berries is not an economical process and could not supply the estimated 30,000 tons the food industry needs every year. Sorbic acid can, however, be readily produced by a number of synthetic methods, including the reaction of crotonaldehyde with ketene, both of which can be made from compounds isolated from petroleum. This synthesis is economically viable and is how sorbic acid is actually produced. Any chemical is defined by its molecular structure, which does not depend on the route by which it was made. The sorbic acid produced by the rowan berry is identical to the sorbic acid produced by chemical synthesis, but because the latter was not extracted from a natural source it is termed “artificial,” and therefore, in the eyes of some people, suspect. The fact is that sorbic acid, irrespective of its source, is a food additive that has passed all the regulatory hurdles, just like its replacement, natamycin.
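In rough terms, the synthesis mentioned above stitches the two petroleum-derived feedstocks into one six-carbon skeleton. The overall stoichiometry can be sketched as follows (a simplification: industrially the reaction proceeds through a polyester intermediate that is then decomposed to sorbic acid):

```latex
% crotonaldehyde + ketene -> sorbic acid (overall atom balance: C4H6O + C2H2O = C6H8O2)
\underbrace{\mathrm{CH_3CH{=}CHCHO}}_{\text{crotonaldehyde}}
+ \underbrace{\mathrm{CH_2{=}C{=}O}}_{\text{ketene}}
\;\rightarrow\;
\underbrace{\mathrm{CH_3CH{=}CHCH{=}CHCOOH}}_{\text{sorbic acid}}
```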
Natamycin is an antifungal agent produced by a soil bacterium that was first found in South Africa’s Natal province, hence the name. Since bacteria occur in nature, any of the chemicals they crank out can be classified as “natural.” But curiously, a substance that occurs in nature, like sorbic acid, is termed an artificial preservative when it is synthesized in the lab. Natamycin may be natural, but it would not be so appealing to people if they knew they were eating the waste product of dirt bacteria. Not that there is anything wrong with that.
When I think of the future, I can’t help but wonder what kind of world it will be when my 3-year-old son Nelson is grown up. Of course I am an optimist, and I love to think about seemingly miraculous technological innovations, peaceful pluralist nations that foster hope and inspiration, and a lush and rich bounty of ecological harmony (cue songbird soundtrack). But then the nagging pessimist in me inevitably points out that we are currently living in a world in which biotechnologies such as genetically engineered crops and vaccines are shunned out of ignorance, someone like Donald Trump is actually a serious (?) contender for the presidency of the United States, and the very real threat of climate change looms over us like an impending jail sentence.
Whereas I can easily dismiss the first two of these concerns as temporary fads, the third, the future challenge of climate change from runaway global warming, terrifies me and fills me with worry for my son and his generation. In this light, the future looks a little less bright.
With melting polar ice caps and the flooding of coastal areas expected to displace the millions of people currently living in low-lying delta areas such as the Mekong, the Nile or the Mississippi, we are likely to see migration crises that dwarf the current Syrian refugee crisis... not to mention the chaos of metropolitan cities like NYC and Amsterdam becoming completely submerged under sea water. Frequent and intense storms will become the norm, and weather patterns are predicted to get more extreme and unpredictable. And on, and on, and on. It paints a pretty grim picture.
All of these scenarios are realistic models of the future, and the cause is clearly known and well understood: the human burning of fossil fuels, which releases CO2 and methane into the atmosphere. Why, then, have we yet to do anything significant about preventing this problem? It goes without saying that this is a complex issue and people may be challenged by the science. But it is imperative to note that we are also being blinded by those who do not want us to know about the science, because if we did, we’d be compelled to do something about the problem.
Let’s first make something clear: there is a near-unanimous consensus on the causes of this issue among the scientists who actually study climate science and the earth’s geochemistry (we’re talking about some seriously smart geeks here). Human activity emits greenhouse gases such as CO2 and methane, which exacerbate the earth’s greenhouse effect, which in turn is rapidly increasing global warming and will lead to significant climate change. Why, then, does the jury seem to still be out among certain members of the public, governments or industries? It all comes down to who is doing the talking, and when it comes to communicating the science of climate change, the fossil fuel industries unfortunately have the loudest voice.
As a professor of science, one of my main interests is the field of science communication and the role that scientists can play in the discourse of science with members of the general public. For this reason, I am increasingly frustrated with the climate change story, because it represents a classic case of a breakdown in the chain of communication from scientific experts to the public. As I shall argue, this is due both to inadequate communication skills and motivation among scientists, and to the role that industry has played in spreading pseudoscience, falsehoods and lies.
As scientists, we are not trained to speak to the public as part of our typical education. The culture of science is a solitary one, involving long hours in the lab or field and much contemplation about the implications of our results. As a general rule, scientists are not the types to seek public attention, nor to be very good at commanding it were they offered the opportunity. There’s a good reason the scientist stereotype is of a closeted, bumbling and incoherent genius... because there is often a lot of truth to it.
On the other hand, fossil fuel industries are among the most profitable economic enterprises in the history of humanity and they devote a good chunk of that money to their public relations departments. As such, they have a tremendous reach and impact in promoting some ideas and squashing others, of getting some politicians elected and others disgraced and shaping policy and social attitudes all in the name of maximizing their profit margins.
This may sound like the ramblings of a paranoid conspiracy theorist, but these facts are well known and part of the public record. For example, recent articles in the Washington Post, Rolling Stone and The Guardian clearly spell out the active role that the American billionaire oil barons, the Koch brothers, are playing in funding puppet organizations that fight climate reform in government, prevent subsidies to the electric vehicle industry and lobby against solar power. It has also recently become known that Exxon-Mobil, one of the largest and most profitable businesses ever, knew about climate change from its own research years before the public did. Over the last 30 years, however, the company has rabidly fought the acceptance that climate change is real and has actively funded anti-science and anti-climate organizations to the tune of millions of dollars.
In light of these formidable opponents, it becomes clear that there is an imbalance in the communication the public is receiving on this important topic. That the science is irrefutable and yet climate skeptics are given equal airplay in TV news stories is a testament to the power of the fossil fuel industry’s lobby, and to the fact that we are being lied to so that they can continue to pollute while we debate idly amongst ourselves.
Well, a few things seem obvious to me: first, that there needs to be a more reasoned public discussion of the science of climate change; and second, that those in charge of the climate deception are likely to be viewed by history as having committed crimes against humanity for the millions of people they will have doomed to lives of misery.
As a father and a science professor, I can do my part to fight this climate ignorance by helping to usher in a new era of science communication, one that empowers people to separate the sense from the nonsense so that together we can combat these weapons of deceit with knowledge. As another Nelson (dear ‘Madiba’ Mandela, who inspired our son’s namesake) would have said much more eloquently than I: “Education is the most powerful weapon you can use to change the world”. We won’t let the Kochs and Exxons of the world take that power away from us anymore.
Since the late 1940s, so-called “subtherapeutic” doses of antibiotics have been routinely added to animal feed to prevent disease and to increase feed efficiency. Exactly why animals put on weight more readily when exposed to small doses of antibiotics isn’t clear, but it may have to do with reducing the competition for nutrients by cutting down on the natural bacterial population in the animals’ gut. Some studies also suggest that antibiotic use thins the intestinal wall and increases nutrient absorption. What has become clear, however, is that such subtherapeutic use of antibiotics leads to the flourishing of antibiotic resistant bacteria in animals and that such bacteria can infect humans. Chickens, for example, will begin to excrete antibiotic-resistant E. coli in their feces just 36 hours after being given tetracycline-laced feed. Within a short time these bacteria also show up in the feces of farmers. And a truly frightening prospect is that bacteria can pass genes between each other, including the ones that make them resistant to antibiotics. This means that bacteria that have never been exposed to an antibiotic can acquire resistance just by encountering resistant ones. Then consider that animals shed bacteria in their feces and that manure is used as fertilizer, and fertilizer gets into ground water, and it quickly becomes evident how the bacterial resistance problem can mushroom.
Thorough cooking of course kills bacteria, but the widespread incidence of food poisoning demonstrates that poor food handling and undercooking are common. True, most people who come down with bacterial food poisoning just experience some unpleasant cramps and diarrhea and recover without the need for antibiotic treatment. In such cases resistance is not an issue. But there are numerous cases of children, the elderly, or people whose immune system is compromised, who need antibiotic treatment for food poisoning. If the bacteria are resistant to antibiotics, these patients can face a dire situation. Take for example the case of an unfortunate Danish woman who died in 1998 after eating Salmonella-infected pork. She failed to respond to ciprofloxacin (Cipro), the usual antibiotic of choice, because of bacterial resistance. In a piece of elegant research, Danish scientists succeeded in genetically matching the resistant Salmonella strain to a specific pig farm. Surprisingly, those pigs had not been treated with ciprofloxacin, but the pigs on neighbouring farms had been, and the resistant bacteria had moved between farms!
In North America antibiotics known as quinolones have been used since 1995 to treat infections in poultry. While this was great for the chickens’ health, it turned out not to be so good for humans. The most common cause of bacterial gastroenteritis in people is Campylobacter jejuni, and poultry is often responsible. If an antibiotic is needed, ciprofloxacin is the usual choice. But since the introduction of quinolones to farm animals, Campylobacter strains resistant to the drug have emerged. The Food and Drug Administration in the U.S. has recognized this as such a serious problem that it has made Baytril, a quinolone, the first veterinary drug to be banned because of the emergence of resistant bacteria. While this is the first action of its kind in North America, Europeans have been phasing out antibiotics in animal feed since the 1980s. Sweden banned the use of antibiotics as growth promoters in 1986, and Swedish farmers responded by improving hygiene on farms and by altering feed composition. They showed that meat can be produced for the consumer at virtually the same cost as with antibiotics. And without a cost to consumers’ health! The European Union has followed suit and on January 1, 2006, banned the use of antibiotics as growth promoters in animal feed. That actually hasn’t resulted in a huge reduction in antibiotic use. While prophylactic use has decreased, therapeutic use of antibiotics in animals has increased, because illnesses that apparently were being prevented by the antibiotics added to feed have become more common.
Antibiotics are wonderful drugs and we must do all we can to protect their efficacy. Certain uses of antibiotics to treat sick animals are justified, but as one scientist who studies antibiotic resistance opined, “Cipro is an essential antibiotic, and we cannot allow its effectiveness to be compromised by squandering it on poultry.”
Shrek the friendly ogre delighted audiences in the 2010 movie hit “Shrek Forever After.” But for fast food giant McDonald’s, Shrek turned out to be a nightmare. As a cross-promotional feature, the company introduced a set of glasses decorated with images of Shrek and other characters from the film. After millions of the glasses had been sold, a problem cropped up that led to a large-scale recall. The yellow pigment used on the cups turned out to be cadmium sulphide, a substance toxic even in small amounts. The concern was that the pigment might rub off on children’s hands and end up being ingested if they then put their hands into their mouths.
Cadmium was discovered in 1817 in Germany by Professor Friedrich Stromeyer, who was looking into a problem encountered by apothecaries making calamine lotion for skin care. The process involved heating “calamine,” a natural ore of zinc carbonate, to produce zinc oxide, the active ingredient in calamine lotion. Sometimes the lotion would end up with a yellow discolouration, which Stromeyer determined was due to a mineral contaminant that he eventually identified as a compound of cadmium.
It was the colour of cadmium compounds that led to their first commercial use. Artists loved the bright yellow of cadmium sulphide and the reds and oranges resulting from a mixture of cadmium sulphide and cadmium selenide. Vincent van Gogh used cadmium sulphide to impart the yellow colour to his flowers in his famous “Flowers in a Blue Vase.” Unfortunately, with time, cadmium sulphide oxidizes to cadmium sulphate, which is white, resulting in the original colour of the painting being slowly altered. Claude Monet's famous yellow hues were also achieved with cadmium pigments.
Cadmium paints are still used today, although they are being phased out. Indeed, Sweden has submitted a report to the European Chemical Agency claiming that artists rinsing their brushes in the sink are responsible for spreading cadmium over agricultural land via sewage sludge.
Cadmium is a cumulative toxin and the World Health Organization has suggested 70 micrograms as the maximum daily safe intake. Ingesting some cadmium is unavoidable because it shows up in crops. How does it get there? Sewage sludge and phosphate rock, both used as fertilizer, can harbour cadmium. As a result, a hamburger can contain about 30 micrograms of cadmium that can be traced to the grass or hay the cow ate, and ultimately to the soil in which the feed was grown. Coal also contains cadmium compounds that can end up in the atmosphere, from where they find their way into soil via rain. Other cadmium compounds may also be released from the nickel-cadmium battery industry, although modern pollution control methods minimize such losses. Cadmium can also be found in significant amounts as a contaminant in zinc ores, and some is released into the environment when the ore is mined as well as when it is smelted into zinc.
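To put those numbers side by side, here is a minimal sketch comparing estimated daily cadmium intake from various foods against the WHO’s suggested 70-microgram daily limit. The hamburger figure comes from the article; the other source values are hypothetical placeholders for illustration only.

```python
# WHO suggested maximum daily cadmium intake, in micrograms (from the article)
WHO_DAILY_LIMIT_UG = 70

# Estimated cadmium per daily food source, in micrograms.
# "hamburger" is the article's figure; the rest are hypothetical examples.
daily_sources_ug = {
    "hamburger": 30,     # from the article
    "rice_serving": 5,   # hypothetical
    "leafy_greens": 3,   # hypothetical
}

total = sum(daily_sources_ug.values())
print(f"Estimated intake: {total} µg (limit: {WHO_DAILY_LIMIT_UG} µg)")
print("Within limit" if total <= WHO_DAILY_LIMIT_UG else "Exceeds limit")
```

The point of the arithmetic is simply that a single everyday food can account for nearly half the suggested daily maximum, which is why eliminating avoidable sources matters for a cumulative toxin.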
Nobody actually carried out a study to determine how much cadmium pigment can rub off onto little hands when gripping a Shrek glass, but it could well be less than what is found in the hamburger those hands are clutching. Still, eliminating any avoidable source of cadmium is desirable, especially since there is suspicion that cadmium compounds may be carcinogenic. Cadmium can also build up in joints and the spine causing a disease that the Japanese have named “Itai-Itai,” which translates as “ouch-ouch,” due to the painful sounds made by victims as cadmium accumulates.
A classic case of environmental cadmium toxicity can be traced back to the early 1900s, although its cause was not identified until the 1960s. It was obvious that something was going on in the vicinity of the Jinzu River and its tributaries in Japan. People were getting sick, screaming in pain and dying prematurely. Suspicion fell on the river and the mining companies that for years and years had been disgorging their wastes into the water. The mountains upstream were rich in minerals that contained silver, lead, copper and zinc, and mines had been operating there for centuries. As demand for these metals increased in the twentieth century, more and more mining wastes found their way into the river, including increased amounts of cadmium ores.
River water was used for irrigation of rice fields, and since rice absorbs cadmium effectively, the metal accumulated in the food supply and consequently in the bodies of the population. The result was ouch-ouch disease. Although cadmium was only identified as the cause around 1965, by the late 1940s it had become obvious that the disease was linked to the water supply and mining companies began to store their wastes instead of releasing them into the river. This prevented more people from contracting cadmium poisoning, but nobody really knows how many victims the mining operations had since they began to pollute the Jinzu River back in the sixteenth century.
In 1966 in England a construction worker died and several others were sickened as a result of inhaling cadmium fumes. The men were using a welding torch to remove bolts as they were dismantling a construction tower used in the building of a bridge. It is common practice to electroplate steel bolts with cadmium, particularly those exposed to water. This is especially useful when there is contact with sea water since cadmium reacts with salt to form an impervious layer of cadmium chloride. In this case the men inhaled the cadmium vapourized by the heat of the welding torch and suffered an acute reaction.
Shrek glasses are not the only items aimed at children that have caused concern about cadmium. With lead now persona non grata, cadmium has been turning up in jewelry aimed at young girls, mostly originating in China. If pieces are accidentally swallowed, or if the jewelry comes into frequent contact with the mouth, enough cadmium may enter the circulation to cause harm. Jewelry made with cadmium should go the way of the Shrek glasses.
These days an array of books, magazines, websites and numerous bloggers promote a variety of flaky “detox” schemes. Our pal Vani believes in a concoction made from celery, cilantro, cucumber, lemon and ginger root. Others promote “detoxing” with lemons, maple syrup and cayenne pepper. There is no mention of what “toxins” are being removed, how they are being removed or what evidence there is that they have been removed. While claims about detox “cleanses” are pure nonsense, the concept of detoxication is real. There are various ways the body deals with foreign substances, with the use of cytochrome p450 enzymes being a classic example.
The “p” in the name of this family of enzymes refers to “pigment,” and 450 refers to the specific wavelength of light, 450 nanometres, used to spectroscopically identify these coloured (chrome) molecules that are found inside cells (cyto). A spectroscopic study involves exposing a sample to various wavelengths of light and determining which wavelengths are absorbed, a common laboratory identification technique.
Our bodies are constantly exposed to foreign substances that have to be dealt with before they can cause harm. Aromatic hydrocarbons in smoke, bacterial toxins, remnants of industrial chemicals as well as food components such as caffeine, tannins and alkaloids have to be eliminated before they engage in harmful reactions. Drugs are also viewed by the body as intruders that need to be eliminated.
Many of these substances that we collectively call “toxins” are relatively insoluble in water, and therefore present a challenge for elimination through the kidneys. The cytochrome enzymes add oxygen atoms to these molecules, increasing the ease with which they can be flushed out of the system. In some cases the addition of oxygen just makes the compound more soluble and therefore more readily excreted through the urine. In others, the oxygen atom introduces a reactive site that molecules such as glucuronic acid can latch on to. Since glucuronic acid is highly soluble, it acts as a sort of “ferry” to help eliminate compounds it has been able to grab. Cytochrome enzymes are found mostly in the liver, the organ charged with intercepting dangerous substances before they can enter the circulation and wreak havoc with more susceptible tissues.
But these enzymes can also be looked upon as the proverbial “double-edged sword.” A drug, such as the painkiller acetaminophen, is a candidate for detoxication as it passes through the liver. The reason such medications have to be taken every few hours is that they are constantly being removed from circulation by the cytochrome enzymes. Indeed, people have different sensitivities to drugs largely because they have different profiles of cytochrome p450 enzymes.
Another problem is that sometimes the oxygenated form of a compound may be more toxic than the original. For example, benzopyrene, a relatively harmless intruder found in smoke, is converted by cytochrome enzymes into a form that can react with and disrupt DNA. In this case, the oxygen provides a “handle” that DNA can seize.
Interaction of toxins with cytochrome p450 can raise yet another concern. Furanocoumarins, naturally occurring compounds found in grapefruit juice, can react with these enzymes and prevent them from carrying out other detoxicating tasks. That’s why patients are told not to take their medications with grapefruit juice. Since the enzymes are not available to eliminate the drug, it can build up and have a greater effect than intended. Certain blood-pressure-lowering drugs, felodipine for example, can then decrease blood pressure to dangerous levels.
Finally, there is also the possibility that some substances the body perceives to be toxins can induce the formation of higher than normal levels of cytochrome enzymes in an attempt to eliminate the intruder. In the process, other substances may be undesirably eliminated as well. There have been cases of organ rejection attributed to patients taking antidepressants such as St. John’s wort while being treated with the immunosuppressive drug cyclosporine. The increased levels of cytochrome enzymes cause more rapid elimination of cyclosporine, resulting in organ rejection.
The year 1828 was a good one for chemistry. In Germany, Friedrich Wöhler demolished the theory that “natural” substances could not be made in the laboratory because they harboured some sort of “vital” force that could not be reproduced. Wöhler managed to convert ammonium cyanate into urea, a “natural” compound found in urine. And in Holland, Coenraad Van Houten accomplished something that would have almost as far-reaching consequences. He invented a press to separate cacao butter from cacao beans and made the production of chocolate as we know it possible!
The source of all chocolate is the cacao bean. These are found inside the pods that are the fruit of the cacao tree. First, the beans are removed from pods and heaped into a pile where they begin to ferment. Sugars start converting to alcohol and acetic acid. As the acid concentration increases, the sprouts inside the beans die and release enzymes that break down the proteins and sugars to a host of flavorful compounds with tongue-twisting names like phenyl acetic acid, dimethyl sulfide, 2-methoxy-4-methylphenol and 1-methylnaphthalene. These are just some of the 800 or so compounds that together make up the flavor of chocolate. Among them is theobromine which is toxic to dogs but mercifully not to humans.
Roasting follows, causing the beans to turn brown. This is a consequence of the “Maillard reaction,” by which amino acids react with sugars to form brown-pigmented “melanoidins.” After roasting, the sprouts, or “nibs” as they are called in the trade, are easily removed and are ground to a substance that hardens and forms “chocolate liquor.” This was the earliest and most primitive form of chocolate. When it was mixed with water it formed a beverage that was bitter and was tainted by an oily layer of fat floating on top. This is where Van Houten stepped in. He developed a hydraulic press with which the cacao fat could be squeezed out of the chocolate liquor, leaving a residue that could be pulverized to a fine powder. It was still bitter, but Van Houten found that if he treated it with sodium bicarbonate or ammonium hydroxide, the bitter-tasting compounds were destroyed and a mild-flavored powder was produced. This made for a far better beverage than had been available up to that time, but perhaps Van Houten’s biggest contribution was the separation of the cacao fat.
About twenty years after his discovery, J.S. Fry in Britain found that the addition of cacao fat and sugar to chocolate liquor would produce a chocolate bar. Previously chocolate had always been consumed as a beverage, but now it could be enjoyed as a solid treat. Then in 1876 Henri Nestlé and Daniel Peter found that adding condensed milk to the bar gave a milder flavor, and American Milton Hershey devised methods of mass production. Chocolates have been delighting young and old alike ever since. Among the 800 or so compounds that make up chocolate flavor is furfuryl alcohol, a chemical that has been used as rocket fuel. I wonder if Ms. Hari will consider issuing a warning about chocolate, because rocket fuel surely doesn’t belong in our bodies.
If you buy a chemical-free product, you’re not getting a good deal. You’re buying nothing. A vacuum. What’s a vacuum? A space empty of all matter. And what is matter? Anything that has mass and occupies space. What is matter made of? Simple. Chemicals. Everything in the world is made of chemicals, which encompass everything from simple elements like gold to incredibly complex molecules like DNA. There are over sixty million known chemicals, both naturally occurring and synthetic. They are not good or bad; their safety and usefulness depend on which chemical we are talking about, how much of it, and in what context. Sugar in your mouth is fine, but you don’t want it in your gas tank. A small dose of Coumadin can prevent a blood clot; a larger dose is great for killing rats in the basement.
Today, unfortunately, many people don’t regard chemicals as the constructs of matter, they consider them to be the agents that deconstruct life. The word chemical itself has become a dirty word, synonymous with poison or toxin. This is especially the case when it comes to synthetic chemicals, which are thought to be particularly sinister. The truth is that the safety of a chemical does not depend on its ancestry; whether it was made by Mother Nature in a bush or by a chemist in a lab is irrelevant. The properties of a molecule are determined by its molecular structure and its safety can only be determined by studying the chemical. And most assuredly, its safety does not depend on the number of syllables in the chemical name. Yet how often do we hear advice like, “if you can’t pronounce it you shouldn’t be eating it!”
It isn’t only the chemically illiterate who see chemicals as the roadblocks to a healthy life. Mainstream producers are also trying to capitalize on the anti-chemical fervor. McCain Foods initiated a campaign promising to use only “real ingredients.” What does that mean? The company claimed to remove “unfamiliar ingredients,” singling out sodium stearoyl lactylate and sodium ascorbate by name. Why remove these? Marketing. Both are approved food additives and have undergone rigorous testing. Sodium stearoyl lactylate is an emulsifier used in baked goods, like pizza dough, that disperses the fats in the dough, allowing less fat to be used while softening the dough at the same time. Sodium ascorbate is the sodium salt of vitamin C and is used as an antioxidant to prevent fat from going rancid. These additives actually make for a better dough. McCain also makes a big deal out of using only vine-ripened tomatoes. That’s great. The riper the tomato, the more natural ascorbate it contains. So while the company sings the praises of taking out ascorbate with one hand, it increases the amount of the same chemical with the other.