
Flowers for America

Over the last few years I have blogged and tweeted about shows from HBO Now, YouTube TV, Acorn, BritBox, and briefly the MHz channel. In response to the pandemic, we dropped all those pay channels and have been streaming free channels like Roku, TUBI, FilmRise, etc. I rewatched UFO, a paranoid 1970 sci-fi series by the team that had produced marionette series like Fireball XL5 and Thunderbirds, and later Space: 1999. I am also rewatching Merlin, which presented the Arthurian legend as a mix of adolescent comedy and melodrama.

I had seen parts of the first Hunger Games film, but TUBI had the entire series for nine more days, so we started watching those, and comparisons to the current economic landscape are inescapable. We’ve also been watching the Genius of the Modern World series on Netflix, which my stepson has not dropped. The first two episodes featured Karl Marx and Friedrich Nietzsche. We watched Nicole Kidman in Bewitched, and last night my wife found a 1990 Cinderella-type flick called If the Shoe Fits, starring post-sex-tape Rob Lowe and post-rhinoplasty Jennifer Grey. That was terrible, but I owed her for sitting through Marx and Nietzsche.

One film we enjoyed was Flowers for Algernon, a 2000 TV movie starring Matthew Modine. I saw Charly in theaters when it first came out in 1968, and found it very moving. Cliff Robertson had played Charlie Gordon on TV in 1961, and he played him again in the film. I thought Robertson deserved his Oscar, though there was one scene, where the developing Charlie tries to be “groovy,” that was tough to watch. Later I ran across the story in a sci-fi anthology. I hadn’t initially thought of it as a science fiction tale, but boosted intelligence got Dr. Morbius into all sorts of trouble in Forbidden Planet, and there were the Outer Limits episodes “Expanding Human” and “The Sixth Finger,” in which Skip Homeier and David McCallum ran afoul of their experiments in increasing intelligence.

Daniel Keyes was an experienced author and editor of pulp-magazine and comic-book science fiction, horror and fantasy, but he also spent some time teaching English to special needs students. At the request of Galaxy Science Fiction, he reportedly developed a synopsis titled Brainstorm into Flowers for Algernon, but the magazine’s editor then requested a happier ending. Keyes published it in The Magazine of Fantasy & Science Fiction instead, and won the Hugo Award for Best Short Story. He later expanded it into a novel, which I have not read; it shared a Nebula Award and was nominated for a Hugo. Again some publishers requested a happier ending, but Harcourt Brace published it as written.

Anyway, we enjoyed Matthew Modine’s performance. There were staging differences from Charly, but the story was essentially the same, and at the end we had to wonder how we would deal with a loss of intelligence. We each have relatives who are dealing with this in a very real way.

As I drifted off to sleep last night, it occurred to me that our nation-state is facing an impending decline of intelligence with far less grace than Charlie Gordon. We’ve witnessed an experiment in which a surfeit of natural resources – taken from around the globe – fueled a massively prosperous middle class, but the experiment is being carefully wound down, and we are being made to forget all the rights and prerogatives we once took for granted.

Hey, Venus

I started to write this piece quite a while ago, but got distracted. I was reminded of it while chatting with a coworker who is a big-time Star Wars and sci-fi fan.

Having gotten tired of insanely high cable TV bills, I cut back to internet-only several years ago. But I did have an apartment-style antenna, which pulls in broadcast stations like Comet TV for free. Comet shows all sorts of low-budget sci-fi, fantasy and horror films, many of which I have blogged about here already. A year or so ago, I happened across a 1960 spacefaring film called First Spaceship on Venus (FSOV).

FSOV’s plot was that an artifact found within an asteroid implied belligerent intentions on the part of intelligent beings on Venus. Earth’s scientists organize a truly international crew – African, American, Chinese, French, German, Indian, Japanese and Russian – to investigate. This diverse crew predated Uhura, Sulu and Chekov on Star Trek by six years. Unfortunately I fell asleep halfway through and woke up towards the end. But I was intrigued.

Some research revealed that the film’s original German title translated to The Silent Star, and that it was based on Stanisław Lem’s novel The Astronauts (Astronauci). Lem wrote on the other side of the Iron Curtain, which explains why I had never heard of him during my peak sci-fi reading years. Lem is famous for Solaris, which spawned three movies, but many of his other works are still difficult to find translated from Polish into English.

I also found that the East German and Polish co-production of The Silent Star (SS) had been heavily edited for American audiences. Crew nationalities were changed and all references to the US atomic bombing of Japan had been removed. That sort of piqued my curiosity. Amazon was no help, but the DEFA Film Library, an East German film archive at UMass Amherst, offered a DVD in the original German with English subtitles.

I stayed awake this time, and found that SS was a very solid space opera with an antiwar, antinuke message. It wasn’t as flashy as Forbidden Planet, but comparable in quality to The Angry Red Planet.

Sometime later Comet showed Voyage to the Planet of Prehistoric Women, from 1968. Even the title sounded cheesy, but I watched it. Scenes of a serious space voyage to Venus were interspersed with scenes of Mamie Van Doren and other pretty blonde women lounging by the seaside, wearing tight white slacks and clamshells over their breasts. These native Venusians were supposedly mentally monitoring and challenging the male intruders, but the two groups never actually came into contact with each other.

I had to look this one up, and found that one of Peter Bogdanovich’s first jobs was to remake the 1962 Soviet film Planeta Bur, or Planet of Storms, into something that could be shown at American drive-in theatres. So Bogdanovich airbrushed out the Soviet logos and inserted all the blondes. Amazon did have Planet of Storms, in Russian with English subtitles, in a bundle with two other Soviet films, A Dream Come True and The Sky Calls. All of these arrived just in time for my Christmas vacation, so my wife got to watch them, too. Wasn’t she happy!

Planeta Bur was a very solid space flick, again not too different in tone from The Angry Red Planet. It was interesting in that the crew’s robot saved crew members’ lives, but was ultimately unwilling to sacrifice itself to save them again. The Sky Calls was about a race between a government ship and a corporate ship to be the first to reach Mars. The corporates are in the lead, but falter, and their crew is saved when the cosmonauts do the right thing and rescue them instead of going for the glory. 1963’s A Dream Come True (Mechte Navstrechu) involves cosmonauts going to rescue aliens from another star system stranded on Mars.

At the time I would have seen these cosmonauts as the Enemy, but after all this time I could see that they mostly had the same hopes and dreams for technology and the future that we did in the US.

Less Sugar, High Anxiety

I recently posted about the effects of demineralized water on my equanimity. But lack of minerals may have been only part of my problem. A few months ago, I began eliminating sweets from my diet, which helped me reach my goal. For years I had been rewarding myself at lunches and dinners with cookies, cakes, chocolate bars or ice cream, or some combination thereof. I bought and ate the best organic sweets, but they still contained a great deal of sugar. Statistics from The Diabetes Council indicate that US citizens consume over 126 grams of sugar per day.
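For scale, here is a quick back-of-the-envelope conversion of that 126-gram figure, assuming the common approximations of about 4 grams of sugar per teaspoon and 4 calories per gram of carbohydrate:

```python
# Rough scale check on the 126 g/day sugar statistic,
# using the standard approximations of ~4 g of sugar per teaspoon
# and ~4 calories per gram of carbohydrate.
GRAMS_PER_DAY = 126
GRAMS_PER_TEASPOON = 4.0
CALORIES_PER_GRAM = 4.0

teaspoons = GRAMS_PER_DAY / GRAMS_PER_TEASPOON   # daily sugar in teaspoons
calories = GRAMS_PER_DAY * CALORIES_PER_GRAM     # daily calories from sugar

print(f"{teaspoons:.1f} teaspoons, {calories:.0f} calories per day")
```

That works out to roughly 30 teaspoons, or about 500 calories, of sugar every day, before counting any other carbohydrates.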

A theory goes that we prefer sweetness because it indicates consumable carbohydrates in ripe fruit, while sourness indicates unripe fruit and bitterness indicates spoiled or poisonous food. Refined sugar is, of course, very sweet, and it is difficult to find prepared foods that do not contain refined sugars such as high-fructose corn syrup. As described in The Conversation, it is easy to get addicted to high levels of sugar in your diet:

Like drugs, sugar spikes dopamine release in the nucleus accumbens. Over the long term, regular sugar consumption actually changes the gene expression and availability of dopamine receptors in both the midbrain and frontal cortex. Specifically, sugar increases the concentration of a type of excitatory receptor called D1, but decreases another receptor type called D2, which is inhibitory. Regular sugar consumption also inhibits the action of the dopamine transporter, a protein which pumps dopamine out of the synapse and back into the neuron after firing.

In short, this means that repeated access to sugar over time leads to prolonged dopamine signalling, greater excitation of the brain’s reward pathways and a need for even more sugar to activate all of the midbrain dopamine receptors like before. The brain becomes tolerant to sugar – and more is needed to attain the same “sugar high.”

After the dopamine receptors have become less available, signalling from normal levels of sugar consumption fails to “reach” the dopamine receptors – until you reacclimate. In the meantime, you can suffer the symptoms of dopamine deficiency. Medical News Today offers a long list of symptoms, but in myself I noticed:

trouble sleeping or disturbed sleep
feeling inexplicably sad or tearful
mood swings
feeling hopeless
feeling guilt-ridden
feeling anxious

I initially chalked these symptoms up to low zinc in my drinking water, but they may have been from low dopamine, or from both. Fortunately I seem to be acclimating, but it was a strange couple of months.

Gut Bacteria, Obesity

In 1977, National Lampoon parodied Scientific American as “Scienterrific American.” I think they were on to something. I’ve written a few posts about whether we should trust scientists, whether scientists can trust each other, etc. Sadly, some scientists will publish what they are paid to publish, and some will publish whatever makes headlines, so they can continue to work. Some of their results are not reproducible, which means they aren’t really doing science. The charitable view is that eventually the scientific method will sort out the scientific from the scienterrific, but a lot of us were ingesting PFOA from Teflon long before we were told that it was a carcinogen.

Recent headlines advised that the FDA had banned sales of many antibacterial soaps, containing any of over a dozen chemicals, because “the risks outweigh the benefits.”

Studies in animals have shown that triclosan and triclocarban can disrupt the normal development of the reproductive system and metabolism, and health experts warn that their effects could be the same in humans. The chemicals were originally used by surgeons to wash their hands before operations, and their use exploded in recent years as manufacturers added them to a variety of products, including mouthwash, laundry detergent, fabrics and baby pacifiers. The Centers for Disease Control and Prevention found the chemicals in the urine of three-quarters of Americans.

That New York Times article notes that a trade group, The American Cleaning Institute, opposes the FDA ruling, and claims to have studies that support their opposition. I’m sure they do.

Scientific American (the real one) has posted an excerpt of a book, Let Them Eat Dirt: Saving Your Child from an Oversanitized World, written by two microbiologists: B. Brett Finlay, Ph.D., and Marie-Claire Arrieta, Ph.D.

Finlay and Arrieta point out that while antibiotics have certainly saved many, many of us from dying young of an infectious disease, they have also changed our environment in more subtle ways. Beyond the fear of breeding unstoppable superbugs, we may be making ourselves susceptible to a raft of non-infectious diseases. One concern is the use of antibiotics in meat; another is the use of antibiotics in early childhood:

While these studies didn’t prove that antibiotics directly cause obesity, the consistency in these correlations, as well as those observed in livestock, prompted scientists to have a closer look. What they found was astonishing. A simple transfer of intestinal bacteria from obese mice into sterile (“germ-free”) mice made these mice obese, too! We’ve heard before that many factors lead to obesity: genetics, high-fat diets, high-carb diets, lack of exercise, etc. But bacteria—really? This raised skepticism among even the biggest fanatics in microbiology, those of us who tend to think that bacteria are the center of our world. However, these types of experiments have been repeated in several different ways and the evidence is very convincing: the presence and absence of certain bacteria early in life helps determine your weight later in life. Even more troubling is the additional research that shows that altering the bacterial communities that inhabit our bodies affects not just weight gain and obesity, but many other chronic diseases in which we previously had no clue that microbes might play a role.

Slowing of Growth

In How Demographics Rule the Global Economy, the Wall Street Journal offers seven interesting articles – Population Implosion, The End of Cheap Labor, Manufacturing Bust, Girl’s Life, Gender Gap, Promise of Youth, and Aging Gracefully – on how demographics may affect the world economy, particularly the growth paradigm.

Ever since the global financial crisis, economists have groped for reasons to explain why growth in the U.S. and abroad has repeatedly disappointed, citing everything from fiscal austerity to the euro meltdown. They are now coming to realize that one of the stiffest headwinds is also one of the hardest to overcome: demographics.

Next year, the world’s advanced economies will reach a critical milestone. For the first time since 1950, their combined working-age population will decline, according to United Nations projections, and by 2050 it will shrink 5%. The ranks of workers will also fall in key emerging markets, such as China and Russia. At the same time the share of these countries’ population over 65 will skyrocket.

There are a great many charts, some of which may be accurate, others of which may be skewed to satisfy the WSJ editorial view. The authors repeat the usual claims that Malthus definitively predicted a Malthusian catastrophe, which I discussed in Dissing Malthus Again, and that Ehrlich and the Club of Rome were wrong to be wary of overpopulation. They take a shot at Alvin Hansen, a Keynesian who advocated government spending to get us out of the Great Depression when he should have known that World War II would do the trick.

Probably the biggest takeaway is that the West will skew older and older while the far more populous Asia skews younger but so decidedly male that brides are already hard to find. Only Africa seems to be on track to continued population growth:

Simply put: A baby boom will lift the poorest continent on Earth into the center of global affairs. Africa will soon become the world’s most reliable source of new life: of college graduates, young workers and budding consumers.

It should be noted that the WSJ article does not consider the effects of climate change, protracted proxy wars, or refugee relocation in these expectations.

First Peoples

A week after posting The Ugly Little Boy, I ran across First Peoples on PBS. There are five episodes: Americas, Asia, Africa, Australia and Europe. I find it worth watching, though I expect that in ten or twenty years some of the theories presented will be superseded as more discoveries are made.

The production company, Wall to Wall Television, includes brief, illustrative scenes of both archaic and early modern humans, and to my eyes they seem to have tried hard to cast appropriate-looking actors. East African modern humans, like Omo-1, are played by dark-skinned African-looking men and women, while the Neanderthals are somewhat lighter-skinned, with bushy hair and heavy prosthetic facial features. Eva of Naharon, found in the Yucatán, looks like a dark Latina woman, and the woman from Tam Pa Ling, found in northern Laos, looks like a dark Eurasian woman. Oase Boy, found in the Carpathian Mountains, looks like a modern Caucasian. The Clovis makers in Northwest America are shown about as light-skinned as present-day Native Americans, while Kennewick Man (the Ancient One) is somewhat darker.

While discussing how interbreeding between modern and archaic species may have occurred, some scenes show a small group of Homo sapiens nervously moving through a landscape, then meeting up with a group of Homo erectus and sharing some food. Later another group of sapiens meets up with Homo neanderthalensis and shares a campfire. Such meetings did (sometimes) happen between European trappers and Native Americans, so it is possible that archaic and modern humans met peacefully, traded goods and either intermarried willingly or sold wives.

It is also possible, though, that one group may have wiped out another and taken women and children as captives. Or perhaps one group may have raided another for the express purpose of stealing women. But watching dark-skinned invaders overwhelm lighter-skinned tribes would certainly be fodder for race-baiting in today’s racially-charged atmosphere.

Speaking of Kennewick Man, after nineteen years of study, it seems about time for the Corps of Engineers to let the Umatilla, or some other tribe, bury him with some dignity.

The Ugly Little Boy

Way back before the internets, Isaac Asimov wrote The Ugly Little Boy, which he included in his anthology Nine Tomorrows. Robert Silverberg later expanded the short story into a novel, which I have not read. In 1977, Barry Morse and Kate Reid starred in a TV version, which is supposed to be very faithful to the short story and is available on YouTube.

The boy was a Neanderthal (or Neandertal) child, brought to, and kept in, the future at great expense of energy by a corporation for scientific research. In the years after the story was written, I read that one scientist claimed we probably couldn’t tell a well-dressed Neanderthal from anyone else on the street, but knew that when lay people heard “Neanderthal” they pictured a dim but muscular caveman with a sloping forehead. And that is how most popular culture has portrayed them, one example being Jean M. Auel’s Clan of the Cave Bear books and the 1986 film starring tall, pale, blonde Daryl Hannah as a Cro-Magnon adopted by a tribe of stocky, swarthy (but not black-skinned), black-haired Neanderthals.

But if one were to recast Clan of the Cave Bear based on the latest information, one should cast light-skinned people as the Neanderthals, and a taller, darker-skinned woman (perhaps Rosario Dawson) as Ayla the Homo sapiens sapiens, or Early European Modern Human (EEMH). It is now suggested that populations of Homo neanderthalensis had already adapted to northern climates over some three or four hundred thousand years, and that the still dark-skinned Homo sapiens benefited by acquiring those traits through interbreeding as they displaced the older species. [It is also counter-suggested that the Neanderthal DNA remains from before the two species diverged from Homo erectus.]

It is currently thought that humans (except those strictly descended from sub-Saharan Africans) have between 1% and 4% Neanderthal DNA, and that some Melanesians and Australian Aborigines have Denisovan DNA as well. In other words, most of us humans are actually ugly little boys and girls, too.

Can Our Scientists Believe Each Other?

In Nautilus, Philip Ball writes, The Trouble With Scientists:

… In 2005, medical science was shaken by a paper with the provocative title “Why most published research findings are false.” Written by John Ioannidis, a professor of medicine at Stanford University, it didn’t actually show that any particular result was wrong. Instead, it showed that the statistics of reported positive findings was not consistent with how often one should expect to find them. As Ioannidis concluded more recently, “many published research findings are false or exaggerated, and an estimated 85 percent of research resources are wasted.”

… the problems of false findings often begin with researchers unwittingly fooling themselves: they fall prey to cognitive biases, common modes of thinking that lure us toward wrong but convenient or attractive conclusions. …

Psychologist Brian Nosek of the University of Virginia says that the most common and problematic bias in science is “motivated reasoning”: We interpret observations to fit a particular idea. Psychologists have shown that “most of our reasoning is in fact rationalization,” he says. In other words, we have already made the decision about what to do or to think, and our “explanation” of our reasoning is really a justification for doing what we wanted to do—or to believe—anyway. Science is of course meant to be more objective and skeptical than everyday thought—but how much is it, really?

Science is supposed to be grounded in reality by following the scientific method, and its safety valve is the peer review process. If the author of a paper is wrong, the ideal is that another scientist will point out the error, and the author will revise or withdraw his theory. Ball notes that the peer review process has flaws, and discusses Nosek’s proposals to fix it, but doesn’t address corporate control of the scientists. When a great deal of money hinges on the results, we find scientists attacking each other’s credibility, with some quietly taking a great deal of money to endorse an industry-friendly position.

Witness the ongoing political and media debates around competing studies on climate change, whether certain injections caused autism, the long-term effects of nuclear fission power plants, whether glyphosate is a carcinogen, and what constitutes a proper diet. Even evolution is still up for debate in our public school system.

Can We Not Trust Our Scientists?

Last weekend, I waited out a cold in bed, and watched the last half of a 1957 sci-fi flick, The 27th Day, which I had seen as a boy and vaguely remembered. The plot was simple enough. For their own reasons, aliens snatched up an American reporter (Gene Barry), a British woman, a Chinese girl, a German physicist and a Soviet soldier. (The females had no careers because this was 1957 and women just stayed home, or something.) To each they gave three capsules that could eliminate all human life within a 1500-mile radius of a given coordinate. The capsules would only function for 27 days, probably because the more advanced a culture, the faster its tech becomes obsolete. The aliens then sent them home, but shortly afterwards announced the situation worldwide. To their credit, all five tried to avoid letting the weapons be used. The poor Chinese girl killed herself, while the British woman sensibly dumped her capsules into the Channel, then less sensibly flew to LA to join Gene Barry in hiding. The Soviets needed torture, sodium pentothal and a good bit of lying to get the capsules from the soldier. The physicist happened to be in LA, too, and was being held by a US government that hadn’t yet adopted torture as a matter of course.

As it turned out, those pesky Soviets were waiting until the last minute to cleanse North America of capitalists, but like many movie scientists (and Spock and eventually Sheldon Cooper), the German physicist was also brilliant outside his chosen field. He analyzed some alien markings on the capsules and decided that he could use them to kill only the bad people. Luckily he didn’t screw up that alien conjugation, or he might have killed only the good people. Or the left-handers. In any case he killed a bunch of people whose only crime was being an Enemy of Freedom with no due process. How Randian.

In the movies there are always those good scientists who heroically risk everything to save the world and those evil scientists who risk destroying the world because they like science better than people. In the real world, most scientists spend a lot of time at the bench and test their stuff eight ways from Sunday before even writing a paper, but they are often funded by venal people that are more devoted to the necessities of commerce than the ideals of science.

In Should the Smithsonian and Other Museums Blow Off Big Fossil?, Greg Laden asks whether public institutions should dump the contributions of those venal people in the oil industry who are paying scientists to muddle the public debate over Anthropogenic Global Warming (AGW), or climate change. Such institutions rely heavily on corporate money because public funding has dwindled under low taxation, so Laden and his commenters suggest raising taxes.

I wonder, though, if we can actually ever separate public money from corporate money. My opinion is that even Pharaohs, Caesars and Kings probably had to answer to the business interests of their era, and that Premiers, Prime Ministers and Presidents will always do the same. Probably the best we can do is to erect a bit of a firewall between those at the benches and those in the marketplace, and hope the scientists aren’t any worse than the rest of us.

Will Climate Get Even Weirder?

Even though we just lived through the warmest calendar year, the warmest twelve-month period, the warmest January and probably the warmest February on record, the most entrenched deniers cite the arctic conditions this winter in the Eastern US as proof that there is no “global warming.” Even so, easily observable weather events of the last few years – superstorms, torrential rains, mudslides, derechos, heat waves, droughts – are slowly beginning to turn the tide of public opinion on Climate Change, or perhaps Global Weirding. But according to a paper in Science Magazine, we’re in for a lot worse very soon. I don’t have a subscription to Science Magazine, and the abstract is just abstract, but here is a slightly edited version of the Editor’s summary:

Atlantic and Pacific multidecadal oscillations and Northern Hemisphere temperatures
Byron A. Steinman, Michael E. Mann, Sonya K. Miller

Which recent climate changes have been forced by greenhouse gas emissions, and which have been natural fluctuations of the climate system? Steinman et al. combined observational data and a large collection of climate models to assess the Northern Hemisphere climate over the past 150 years … At various points in time, the Pacific Multidecadal Oscillation (PMO) and the Atlantic Multidecadal Oscillation (AMO) have played particularly large roles in producing temperature trends. Their effects have combined to cause the apparent pause in warming at the beginning of the 21st century, known as the warming “hiatus.” This pause is projected to end in the near future as temperatures resume their upward climb.

On Science Blogs, Greg Laden discusses the article and shows some helpful charts. The paper notes that Pacific Ocean temperatures have been low, but are due to swing back up, and since the Pacific is so large, it will drive the whole system temperature up. Even with the Pacific trending low we have seen record warming and bizarre weather. What happens next?

Study author Michael Mann told me, “The PMO appears to be very close to a turning point, based on the historical pattern. So we don’t expect it to continue to plunge downward. We expect a turning point soon.” In his summary of the work in Real Climate, Mann notes that “the most worrying implication of our study [is] that the “false pause” may simply have been a cause for false complacency, when it comes to averting dangerous climate change.”

There won’t be any averting. There will be a great deal of death, loss, hardship and regret, followed by half-measures and finger-pointing.

Update 20150301: Scientific American, The Pause in Global Warming Is Finally Explained