I recently posted about the effects of demineralized water on my equanimity. But lack of minerals may have been only part of my problem. A few months ago, I began eliminating sweets from my diet, which helped me reach my goal. For years I had been rewarding myself at lunches and dinners with cookies, cakes, chocolate bars or ice cream, or some combination thereof. I bought and ate the best organic sweets, but they still contained a great deal of sugar. Statistics from The Diabetes Council indicate that US citizens consume over 126 grams of sugar per day.
A theory goes that we prefer sweetness because it indicates consumable carbohydrates in ripe fruit, while sourness indicates unripe fruit and bitterness indicates spoiled or poisonous food. Refined sugar is, of course, very sweet, and it is difficult to find prepared foods that do not contain refined sugars such as high fructose corn syrup. As described in The Conversation, it is easy to get addicted to high levels of sugar in your diet:
Like drugs, sugar spikes dopamine release in the nucleus accumbens. Over the long term, regular sugar consumption actually changes the gene expression and availability of dopamine receptors in both the midbrain and frontal cortex. Specifically, sugar increases the concentration of a type of excitatory receptor called D1, but decreases another receptor type called D2, which is inhibitory. Regular sugar consumption also inhibits the action of the dopamine transporter, a protein which pumps dopamine out of the synapse and back into the neuron after firing.
In short, this means that repeated access to sugar over time leads to prolonged dopamine signalling, greater excitation of the brain’s reward pathways and a need for even more sugar to activate all of the midbrain dopamine receptors like before. The brain becomes tolerant to sugar – and more is needed to attain the same “sugar high.”
After the dopamine receptors have become less available, signalling from normal levels of sugar consumption fails to “reach” the dopamine receptors – until you reacclimate. In the meantime, you can suffer the symptoms of dopamine deficiency. Medical News Today offers a long list of symptoms, but in myself I noticed:
trouble sleeping or disturbed sleep
feeling inexplicably sad or tearful
I initially chalked these symptoms up to low zinc in my drinking water, but they may have been from low dopamine, or from both. Fortunately I seem to be acclimating, but it was a strange couple of months.
In 1977, National Lampoon parodied Scientific American as “Scienterrific American.” I think they were on to something. I’ve written a few posts about whether we should trust scientists, whether scientists can trust each other, etc. Sadly, some scientists will publish what they are paid to publish, and some will publish whatever makes headlines, so they can continue to work. Some of their results are not reproducible, which means they aren’t really doing science. The charitable view is that eventually the scientific method will sort out the scientific from the scienterrific, but a lot of us were ingesting PFOA from Teflon long before we were told that it was a carcinogen.
Recent headlines advised that the FDA had banned sales of many antibacterial soaps, containing any of over a dozen chemicals, because “the risks outweigh the benefits.”
Studies in animals have shown that triclosan and triclocarban can disrupt the normal development of the reproductive system and metabolism, and health experts warn that their effects could be the same in humans. The chemicals were originally used by surgeons to wash their hands before operations, and their use exploded in recent years as manufacturers added them to a variety of products, including mouthwash, laundry detergent, fabrics and baby pacifiers. The Centers for Disease Control and Prevention found the chemicals in the urine of three-quarters of Americans.
That New York Times article notes that a trade group, The American Cleaning Institute, opposes the FDA ruling, and claims to have studies that support their opposition. I’m sure they do.
Scientific American (the real one) has posted an excerpt of a book, Let Them Eat Dirt: Saving Your Child from an Oversanitized World, written by two microbiologists: B. Brett Finlay, Ph.D., and Marie-Claire Arrieta, Ph.D.
Finlay and Arrieta point out that while antibiotics have certainly saved many, many of us from dying young from an infectious disease, they have also changed our environment in more subtle ways. Besides the fear about developing unstoppable superbugs, we may be making ourselves susceptible to a raft of non-infectious diseases. One concern is the use of antibiotics in meat, another is the use of antibiotics in early childhood:
While these studies didn’t prove that antibiotics directly cause obesity, the consistency in these correlations, as well as those observed in livestock, prompted scientists to have a closer look. What they found was astonishing. A simple transfer of intestinal bacteria from obese mice into sterile (“germ-free”) mice made these mice obese, too! We’ve heard before that many factors lead to obesity: genetics, high-fat diets, high-carb diets, lack of exercise, etc. But bacteria—really? This raised skepticism among even the biggest fanatics in microbiology, those of us who tend to think that bacteria are the center of our world. However, these types of experiments have been repeated in several different ways and the evidence is very convincing: the presence and absence of certain bacteria early in life helps determine your weight later in life. Even more troubling is the additional research that shows that altering the bacterial communities that inhabit our bodies affects not just weight gain and obesity, but many other chronic diseases in which we previously had no clue that microbes might play a role.
In How Demographics Rule the Global Economy, the Wall Street Journal offers seven interesting articles – Population Implosion, The End of Cheap Labor, Manufacturing Bust, Girl’s Life, Gender Gap, Promise of Youth, and Aging Gracefully – on how demographics may affect the world economy, particularly the growth paradigm.
Ever since the global financial crisis, economists have groped for reasons to explain why growth in the U.S. and abroad has repeatedly disappointed, citing everything from fiscal austerity to the euro meltdown. They are now coming to realize that one of the stiffest headwinds is also one of the hardest to overcome: demographics.
Next year, the world’s advanced economies will reach a critical milestone. For the first time since 1950, their combined working-age population will decline, according to United Nations projections, and by 2050 it will shrink 5%. The ranks of workers will also fall in key emerging markets, such as China and Russia. At the same time the share of these countries’ population over 65 will skyrocket.
There are a great many charts, some of which may be accurate, others of which may be skewed to satisfy the WSJ editorial view. The authors repeat the usual claims that Malthus definitively predicted a Malthusian catastrophe, which I discussed in Dissing Malthus Again, and that Ehrlich and the Club of Rome were wrong to be wary of overpopulation. They take a shot at Alvin Hansen, a Keynesian who advocated government spending to get us out of the Great Depression when he should have known that World War II would do the trick.
Probably the biggest takeaway is that the West will skew older and older while the far more populous Asia skews younger but so decidedly male that brides are already hard to find. Only Africa seems to be on track to continued population growth:
Simply put: A baby boom will lift the poorest continent on Earth into the center of global affairs. Africa will soon become the world’s most reliable source of new life: of college graduates, young workers and budding consumers.
It should be noted that the WSJ article does not consider the effects of climate change, protracted proxy wars, or refugee relocation in these expectations.
A week after posting The Ugly Little Boy, I ran across First Peoples on PBS. There are five episodes: Americas, Asia, Africa, Australia and Europe. I find it worth watching, though I expect that in ten or twenty years some of the theories presented will be superseded as more discoveries are made.
The production company, Wall to Wall Television, includes brief, illustrative scenes of both archaic and early modern humans, and to my eyes, they seem to have tried hard to cast appropriate-looking actors. East African modern humans, like Omo-1, are played by dark-skinned African-looking men and women, while the Neanderthals are somewhat lighter-skinned with bushy hair and heavy prosthetic facial features. Eva of Naharon, found in the Yucatan, looks like a dark Latina woman and the woman from Tam Pa Ling, found in Northern Laos, looks like a dark Eurasian woman. Oase Boy, found in the Carpathian Mountains, looks like a modern Caucasian. The Clovis Makers in Northwest America are shown just about as light-skinned as present-day native Americans, while Kennewick Man (the Ancient One) is somewhat darker.
While discussing how interbreeding between modern and archaic species may have occurred, some scenes show a small group of Homo sapiens nervously moving through a landscape, then meeting up with a group of Homo erectus and sharing some food. Later another group of sapiens meets up with Homo neanderthalensis and shares a campfire. Such meetings did (sometimes) happen between European trappers and native Americans, so it is possible that archaic and modern humans met peacefully, traded goods and either intermarried willingly or sold wives.
It is also possible, though, that one group may have wiped out another and taken women and children as captives. Or perhaps one group may have raided another for the express purpose of stealing women. But watching dark-skinned invaders overwhelm lighter-skinned tribes would certainly be fodder for race-baiting in today’s racially-charged atmosphere.
Speaking of Kennewick Man, after nineteen years of study, it seems about time for the Corps of Engineers to let the Umatilla, or some other tribe, bury him with some dignity.
Way back before the internets, Isaac Asimov wrote The Ugly Little Boy, which he included in his anthology Nine Tomorrows. Robert Silverberg later expanded the short story into a novel, which I have not read. In 1977, Barry Morse and Kate Reid starred in a TV version, which is supposed to be very faithful to the short story and is available on YouTube.
The boy was a neanderthal (or neandertal) child, brought to, and kept in, the future at great expense of energy by a corporation for scientific research. In the years after the story was written, I read that one scientist claimed we probably couldn’t tell a well-dressed neanderthal apart from anyone else on the street, but knew that when the lay person heard, “neanderthal” they saw a dim but muscular caveman with a sloping forehead. And that is how most popular culture has portrayed them, one example being Jean M. Auel’s Clan of the Cave Bear books and the 1986 film starring tall, pale, blonde Daryl Hannah as a Cro-Magnon, adopted by a tribe of stocky, swarthy (but not black-skinned), black-haired Neanderthals.
But if one were to recast Clan of the Cave Bear based on the latest information, one should cast light-skinned people as the Neanderthals, and a taller, darker-skinned woman (perhaps Rosario Dawson) as Ayla the Homo sapiens sapiens or Early European Modern Human (EEMH). It is now suggested that populations of Homo neanderthalensis had already adapted to Northern climates over some three or four hundred thousand years, and that the still dark-skinned Homo sapiens benefited by acquiring those traits through interbreeding as they displaced the older species. [It is also counter-suggested that the neanderthal DNA remains from before the two species diverged from Homo erectus.]
It is currently thought that humans (except those strictly descended from sub-Saharan Africans) have between 1% and 4% neanderthal DNA and that some Melanesians and Australian Aborigines have Denisovan DNA as well. In other words, most of us humans are actually ugly little boys and girls, too.
In Nautilus, Philip Ball writes, The Trouble With Scientists:
… In 2005, medical science was shaken by a paper with the provocative title “Why most published research findings are false.” Written by John Ioannidis, a professor of medicine at Stanford University, it didn’t actually show that any particular result was wrong. Instead, it showed that the statistics of reported positive findings was not consistent with how often one should expect to find them. As Ioannidis concluded more recently, “many published research findings are false or exaggerated, and an estimated 85 percent of research resources are wasted.”
… the problems of false findings often begin with researchers unwittingly fooling themselves: they fall prey to cognitive biases, common modes of thinking that lure us toward wrong but convenient or attractive conclusions. …
Psychologist Brian Nosek of the University of Virginia says that the most common and problematic bias in science is “motivated reasoning”: We interpret observations to fit a particular idea. Psychologists have shown that “most of our reasoning is in fact rationalization,” he says. In other words, we have already made the decision about what to do or to think, and our “explanation” of our reasoning is really a justification for doing what we wanted to do—or to believe—anyway. Science is of course meant to be more objective and skeptical than everyday thought—but how much is it, really?
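The arithmetic behind Ioannidis’s claim is worth seeing for yourself. The sketch below is my own back-of-envelope illustration, not his exact model: given how many tested hypotheses are actually true, plus the standard significance threshold and statistical power, it computes what fraction of “positive” findings should be true.

```python
# Back-of-envelope version of the false-findings argument: even with
# conventional statistical standards, a large share of significant
# results can be false if most tested hypotheses are false.

def share_of_true_positives(prior, power=0.8, alpha=0.05):
    """Fraction of statistically significant results that reflect real effects.

    prior: fraction of tested hypotheses that are actually true
    power: probability a true effect reaches significance
    alpha: probability a null effect reaches significance anyway
    """
    true_hits = prior * power          # true effects that test positive
    false_hits = (1 - prior) * alpha   # null effects that test positive
    return true_hits / (true_hits + false_hits)

# If only 1 in 10 tested hypotheses is true, then with ideal power (0.8)
# and the usual p < 0.05 threshold, only 64% of significant findings
# are real -- more than a third are false:
print(round(share_of_true_positives(0.10), 2))  # prints 0.64
```

Lower power, smaller priors, or any bias in analysis only make that fraction worse, which is the spirit of Ioannidis’s conclusion.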
Science is supposed to be grounded in reality by following the scientific method, and its safety valve is the peer review process. If the author of a paper is wrong, the ideal is that another scientist will point out the error, and the author will revise or withdraw his theory. Ball notes that the peer review process has flaws, and discusses Nosek’s proposals to fix it, but doesn’t address corporate control of the scientists. When a great deal of money hinges on the results, we find scientists attacking each other’s credibility, with some quietly taking a great deal of money to endorse an industry-friendly position.
Witness ongoing political and media debates around competing studies on climate change, whether certain injections caused autism, the long term effects of nuclear fission power plants, whether glyphosate is a carcinogen, and what constitutes a proper diet. Even evolution is still up for debate in our public school system.
Last weekend, I waited out a cold in bed, and watched the last half of a 1957 sci-fi flick, The 27th Day, which I had seen as a boy and vaguely remembered. The plot was simple enough. For their own reasons, aliens snatched up an American reporter (Gene Barry), a British woman, a Chinese girl, a German physicist and a Soviet soldier. (The females had no careers because this was 1957 and women just stayed home, or something.) To each they gave three capsules that could eliminate all human life within a 1500 mile radius of a given coordinate. The capsules would only function for 27 days, probably because the more advanced a culture, the faster its tech becomes obsolete. The aliens then sent them home, but shortly afterwards announced the situation worldwide. To their credit all five tried to avoid letting the weapons be used. The poor Chinese girl killed herself while the British woman sensibly dumped her capsules into the Channel then less sensibly flew to LA to join Gene Barry in hiding. The Soviets needed torture, sodium pentothal and a good bit of lying to get the capsules from the soldier. The physicist happened to be in LA, too, and was being held by a US government that hadn’t yet adopted torture as a matter of course.
As it turned out, those pesky Soviets were waiting until the last minute to cleanse North America of capitalists, but like many movie scientists (and Spock and eventually Sheldon Cooper), the German physicist was also brilliant outside his chosen field. He analyzed some alien markings on the capsules and decided that he could use them to kill only the bad people. Luckily he didn’t screw up that alien conjugation, or he might have killed only the good people. Or the left-handers. In any case he killed a bunch of people whose only crime was being an Enemy of Freedom with no due process. How Randian.
In the movies there are always those good scientists who heroically risk everything to save the world and those evil scientists who risk destroying the world because they like science better than people. In the real world, most scientists spend a lot of time at the bench and test their stuff eight ways from Sunday before even writing a paper, but they are often funded by venal people that are more devoted to the necessities of commerce than the ideals of science.
In Should the Smithsonian and Other Museums Blow Off Big Fossil?, Greg Laden asks whether public institutions should dump the contributions of those venal people in the oil industry that are paying scientists to muddle the public debate over Anthropogenic Global Warming (AGW), or climate change. Such institutions rely heavily on corporate money because public money has dwindled under low taxation, so Laden and his commenters suggest raising taxes.
I wonder, though, if we can actually ever separate public money from corporate money. My opinion is that even Pharaohs, Caesars and Kings probably had to answer to the business interests of their era, and that Premiers, Prime Ministers and Presidents will always do the same. Probably the best we can do is to erect a bit of a firewall between those at the benches and those in the marketplace, and hope the scientists aren’t any worse than the rest of us.