From Our Contributors
What scientific debate today is more discussed than that of how to make tea? As the issue currently stands (a hotly contested and bitter stalemate), the field is divided into two main camps: the Tea-Firsts and the Milk-Firsts. Not since Swift’s Endian Dilemma over how to crack an egg has such fierce culinary fervor been whipped up without even so much as a whisk. I hope here to present a thorough documentation and comparison of several of the extant expert opinions and published works on the subject, with a critical eye toward empiricism, chemistry, and humor.
Ignoring those who leave tea un-milked or drink a tea incompatible with milk or unthinkably drink something other than tea, most will either introduce milk, cream, half-and-half, or correction fluid into their cup, mug, thermos, or volute krater before tea, or after (one small subculture subscribes to introducing the two simultaneously but is decried as heretical by both major schools of thought). The British are clearly the experts, and there is no dearth of literature from their corner. George Orwell establishes himself staunchly on the Tea-First side in “A Nice Cup of Tea,” proclaiming, “by putting the tea in first and stirring as one pours, one can exactly regulate the amount of milk whereas one is liable to put in too much milk if one does it the other way around.”
Douglas Adams, however, is an unashamed Milk-First, outlining his position in “Tea” from The Salmon of Doubt: “It’s probably best to put some milk into the bottom of the cup before you pour in the tea. If you pour milk into a cup of hot tea, you will scald the milk.” A footnote says, “This is socially incorrect. The socially correct way of pouring tea is to put the milk in after the tea.” This adds the benefit of etiquette to the Tea-Firsts.
But writers, citizenship notwithstanding, are not necessarily experts; the problem is a scientific one. The International Organization for Standardization (ISO), whose expertise in the field of making things make sense evidently does not extend to acronyms, in 1980 described, precisely and definitively, a standardized method for brewing tea. The standard, winner of the 1999 Ig Nobel Prize for Literature, also advocates Milk-First brewing. ISO standards are not so much suggestions for making good tea as detailed procedures for making consistent tea, but ISO 3103:1980 nonetheless represents an ideal modus operandi in the eyes of the world’s largest standards organization.
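For those curious what a “standardized” brew actually pins down, here is a rough sketch in code. The commonly cited figures from the standard are about 2 g of leaf per 100 ml of freshly boiled water and a six-minute steep; the structure, names, and defaults below are my own illustrative assumptions, not a transcription of ISO 3103.

```python
# A rough sketch of a "standardized" brew in the spirit of ISO 3103.
# The 2 g per 100 ml and six-minute figures are the commonly cited ones;
# everything else (names, defaults, the milk_first flag) is illustrative.

from dataclasses import dataclass


@dataclass
class Brew:
    water_ml: float
    leaf_g: float
    brew_minutes: float
    milk_first: bool  # the actual point of contention


def standard_brew(water_ml: float = 100.0, milk_first: bool = True) -> Brew:
    """Scale the leaf to the water and fix the brewing time."""
    return Brew(
        water_ml=water_ml,
        leaf_g=round(0.02 * water_ml, 2),  # ~2 g of tea per 100 ml of water
        brew_minutes=6.0,                  # the oft-quoted six-minute steep
        milk_first=milk_first,             # milk in the cup before the tea
    )


print(standard_brew(140.0))
# Brew(water_ml=140.0, leaf_g=2.8, brew_minutes=6.0, milk_first=True)
```

Consistency, not quality, is the point: fix the parameters and every cup becomes comparable.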
In 1859, Charles Darwin set out his theory of evolution by natural selection, a theory that consists of three vital components – variation, inheritance and differential reproductive success. We are not all as beautiful as Angelina Jolie, for example (number one on the Official Top 30 World’s Most Beautiful Women of 2013), but her biological children likely will be, and no doubt many a man would be willing to father those children. Parents and offspring can share more than looks, however, and we can only hope that Angelina’s biological daughters are not also at risk of developing breast cancer. Without inheritance, adaptations (such as good looks) and maladaptations (such as heritable cancer risk) alike could not be passed on from one generation to the next. A century’s worth of work aimed at understanding this process of inheritance, including experiments with peas and bacterial viruses, finally culminated in 1953, when James Watson and Francis Crick revealed the structure and properties of DNA, the molecule that carries genetic information from generation to generation.
Eight years later, in 1961, a young biochemist named Marshall Nirenberg made a breakthrough discovery that allowed the genetic code – ‘the blueprint for life’ – to be deciphered. According to the code, the great diversity of life on our planet is generated in a remarkably simple manner: a four-letter alphabet (the bases, or nucleotides, A, C, G and T in DNA, with U taking the place of T in RNA), combined into 64 three-letter words (codons, such as AGG), codes for the twenty amino acids that serve as the building blocks of proteins, which are in turn the building blocks of life. Discovering the genetic code allowed us to understand how the variation essential for Darwin’s theory of natural selection could arise. A mutation that changes a single base from an A to a C, for example, can cause a different amino acid to be produced and a different protein to be built. This might sound trivial, but such minor changes in the genetic code can have dramatic impacts on an organism – how it appears physically, how it behaves, how likely it is to develop cancer. Who we are as humans, and our uniqueness relative to others, appeared to be controlled quite tangibly and inflexibly by what is written in our DNA.
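To make the arithmetic concrete: four bases taken three at a time give 4 × 4 × 4 = 64 possible codons. The sketch below shows how reading a sequence codon by codon yields a chain of amino acids, and how a single A-to-C change swaps one amino acid for another; the tiny codon table and the two sequences are illustrative examples only, not genes mentioned in the text.

```python
# A minimal sketch of how the genetic code is read, using a DNA-style
# four-letter alphabet (A, C, G, T). Only a handful of entries from the
# standard 64-codon table are included here, purely for illustration.

CODON_TABLE = {
    "ATG": "Met",  # methionine (also the "start" signal)
    "AAA": "Lys",  # lysine
    "ACA": "Thr",  # threonine
    "GGA": "Gly",  # glycine
    "TAA": "STOP",
}


def translate(dna: str) -> str:
    """Read a DNA string three bases at a time and return the amino acid chain."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE.get(dna[i:i + 3], "?")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return "-".join(protein)


# A single A -> C mutation in the second codon swaps one amino acid for another.
print(translate("ATGAAAGGA"))  # Met-Lys-Gly
print(translate("ATGACAGGA"))  # Met-Thr-Gly
```

Real translation works on messenger RNA and uses the full 64-entry table, but the principle is the same: change one letter and you may change the protein.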
It almost seems foolish to have believed it so simple. The burgeoning field of epigenetics has more recently forced us to see the DNA world in a completely different light. Epigenetics (‘epi’ = ‘on top of’ or ‘above’) is the study of changes in gene activity that are not caused by changes in the DNA sequence. You can see the dilemma here – modification of gene function without change in the nucleotide sequence surely breaks the previously accepted rules.
Long chemical names on the back of food labels very frequently send people running for the hills. These “food additives” quickly get pegged as dangerous food fillers out to ruin the digestive systems and the very lives of consumers. One of these “scary” chemicals found in our daily diet is carboxymethylcellulose, also known as cellulose gum. It comes from the cell walls of plants, and is used to make paper. That’s right – trees, in your food. Sort of.
Products from Fiber One, Pillsbury, Betty Crocker, and even Duncan Hines, whose Devil’s Food Cake mix pierces both your heart and arteries, contain cellulose gum. And while many resent the “lies” spewed by large food companies and claim that additives crowd out other, “natural” ingredients, remember that every food we eat is made up of chemicals.
Apples are known to contain trace amounts of cyanide, and cassava, more commonly known in North America as tapioca, contains enough cyanide to kill a person if improperly prepared. Yet we eat these products without hesitation. It is important to realize that a compound synthesized in a lab can be safer than one found in nature. Indeed, nature produces many of the most toxic compounds known to man.
The U.S. Food and Drug Administration, along with international organizations, regulates food additives such as cellulose gum. Since the passage of the Food Additives Amendment in 1958, any new ingredient must be reviewed by the FDA before it can be approved as an additive.
Cellulose is an abundant, natural polysaccharide found in all plants. Cellulose gum is a water-soluble gum derived from cellulose. Manufacturers use a derivative of acetic acid, the acid found in vinegar, to break down the cell walls and form the gum. It has been used for over 50 years as a thickening agent, a stabilizer, and an emulsifier.
Gums serve many useful functions in food, with the added benefit of not changing the flavor of the food to which they have been added.
On a crisp Saturday morning, internal medicine residents from all over the state of New York trickled into the University of Rochester’s School of Medicine, sporting ties and dresses in lieu of their usual white coats and scrubs. In their hands they clutched precious cargo: a cylinder protecting a poster to be presented on their research or on an unusual clinical case.
I was among the ninety-seven young (when does one cease to be called young, I wonder?) residents who had been selected to present their work at the annual New York American College of Physicians’ abstract competition. I had to rush to the airport right after work on a Friday afternoon, and then, with barely six hours of sleep (no, six hours is not enough), drag myself up and out into the cold air of Rochester, firmly gripping my poster, all the while wondering why I was giving myself all this extra work on my only free weekend of the month.
“The zeal to recommend extreme reductions in sodium...is a case of ideology replacing good science.” Is this the statement of some right-wing newspaper columnist or food industry executive? No. This is Dr. Salim Yusuf, the Heart and Stroke Foundation chair in Cardiovascular Disease at McMaster University, arguing that there has been far too much focus on the policy of sodium reduction as a means to curb cardiovascular disease. Immediately, another leading Canadian scientist, Dr. Norman Campbell of the University of Calgary, came out swinging, not only disputing Yusuf’s science as having “fatal flaws,” but also getting down in the scientific gutter, questioning his competence in the field by claiming that Yusuf “is way off his expertise...he doesn’t have a strong understanding of what the evidence is.” Not to be outdone, Yusuf countered that while he considers Campbell well-meaning, the poor chap is basing his dramatic public health measures on “scant” evidence. Moreover, “Norman has been one of those — in polite terms — evangelists about sodium — in impolite terms, Talibans about sodium.” Them’s fighting words!
With this level of “scientific” debate, what’s the consumer or policy-maker to do? Only two years ago sodium reduction was widely presented as an area of relatively settled science, and senior managers (and the minister) were criticized for not more aggressively following their scientists’ advice to get tougher with the food industry.