Tag Archives: scientific inquiry

Monarch Butterflies Use Medicinal Plants

Monarch butterflies eat toxic plants that they have evolved to tolerate, and the toxins make the butterflies themselves toxic to predators. Research by Emory biologists shows they also use medicinal plants to treat their offspring for disease. When infected by certain parasites, the butterflies show a strong preference to lay their eggs on tropical milkweed, a plant whose leaves help the caterpillars fight the parasite when they eat them (in effect, the plant serves as a drug). The experiments may be the best evidence to date that animals use medication.

Related: Monarch Migration Research – Monarch Butterfly Migration – Evolution at Work with the Blue Moon Butterfly

Big Bangless and Endless Universe

A new theory does away with the big bang and dark energy by treating space, time, mass and energy as interconvertible, in a universe with no beginning and no ending.

Big Bang Abandoned in New Model of the Universe

Wun-Yi Shu at the National Tsing Hua University in Taiwan has developed an innovative new description of the Universe in which the roles of time, space and mass are related in a new kind of relativity.

Shu’s idea is that time and space are not independent entities but can be converted back and forth between each other. In his formulation of the geometry of spacetime, the speed of light is simply the conversion factor between the two. Similarly, mass and length are interchangeable in a relationship in which the conversion factor depends on both the gravitational constant G and the speed of light, neither of which need be constant.
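
As a rough dimensional illustration of the conversions described above (these are just the standard factors built from c and G; Shu's paper defines its own precise relations), time converts to length through c and mass converts to length through G/c²:

```latex
% Illustrative dimensional conversions (not Shu's exact formulation)
\[
  x = c\,t, \qquad \ell = \frac{G\,m}{c^{2}}
\]
% If G and c are themselves allowed to vary, these "exchange rates" vary with them.
```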

So as the Universe expands, mass and time are converted to length and space and vice versa as it contracts. This universe has no beginning or end, just alternating periods of expansion and contraction. In fact, Shu shows that singularities cannot exist in this cosmos.

It’s easy to dismiss this idea as just another amusing and unrealistic model dreamed up by those wacky cosmologists.

That is until you look at the predictions it makes. During a period of expansion, an observer in this universe would see an odd kind of change in the red-shift of bright objects such as Type-I supernovas, as they accelerate away. It turns out, says Shu, that his data exactly matches the observations that astronomers have made on Earth.

That’s not to say Shu’s theory is perfect. Far from it. One of the biggest problems he faces is explaining the existence and structure of the cosmic microwave background, something that many astrophysicists believe to be the strongest evidence that the Big Bang really did happen. The CMB, they say, is the echo of the Big Bang.

How it might arise in Shu’s cosmology isn’t yet clear but I imagine he’s working on it.

Science is useful in letting us understand the world better. But it also is an evolving understanding as we learn more and search for answers to more questions. Many attempts to put forth new ideas and have them gain acceptance are made. Most fail to gain traction. But even many of the ideas that are not accepted are interesting.

Read Cosmological Models with No Big Bang by Wun-Yi Shu (on the wonderful open access arXiv).

Related: Why Wasn’t the Earth Covered in Ice 4 Billion Years Ago, When the Sun was Dimmer – Why do we Need Dark Energy to Explain the Observable Universe? – The State of Physics

Letting Children Learn – Hole in the Wall Computers

The hole in the wall experiments are exactly the kind of thing I love to learn about. I wrote about them in 2006, in what kids can learn.

Research finding from the Hole in the Wall foundation:

Over the 4 year research phase (2000-2004), HiWEL has extensively studied the impact of Learning Stations on children. Hole-in-the-Wall Learning Stations were installed in diverse settings, the impact of interventions was monitored and data was continually gathered, analyzed and interpreted. Rigorous assessments were conducted to measure academic achievement, behaviour, personality profile, computer literacy and correlations with socio-economic indicators.

The sociometric survey found:

  • Children self-organize into groups of Leaders (experts), Connectors and Novices.
  • The Leaders and Connectors identified seem to display an ability to connect with and teach other users.
  • Key leaders, on receiving targeted intervention, play a key role in bringing about a “multiplier effect in learning” within the community.
  • Girls are often seen to take on the role of Connector, initiating younger children and siblings (usually novices with little or no exposure to computers) and connecting them to the leaders in the group.

I believe traditional education is helpful. I believe people are “wired” to learn. They want to learn. We need to create environments that let them learn. We need to avoid crushing the desire to learn (stop de-motivating people).

If you want to get right to talking about the hole in the wall experiments, skip to the 8 minute mark.

Related: Providing Computer to Remote Students in Nepal – Teaching Through Tinkering – Kids Need Adventurous Play – Science Toys You Can Make With Your Kids

Mycoremediation and its Applications In Oil Spills

The webcast shows a talk by mycologist Paul Stamets on Bioremediation with Fungi (an excerpt from Mushrooms as Planetary Healers). In response to the British Petroleum/Halliburton oil spill he posted a message, Fungi Perfecti: the petroleum problem.

Various enzymes (from mushroom mycoremediation) break down a wide assortment of hydrocarbon toxins.
…
My work with Battelle Laboratories, in collaboration with their scientists, resulted in TAH’s (Total Aromatic Hydrocarbons) in diesel contaminated soil to be reduced from 10,000 ppm to < 200 ppm in 16 weeks from a 25% inoculation rate of oyster (Pleurotus ostreatus) mycelium, allowing the remediated soil to be approved for use as landscaping soil along highways. [paper]

Aged mycelium from oyster mushrooms (Pleurotus ostreatus) mixed in with ‘compost’ made from woodchips and yard waste (50:50 by volume) resulted in far better degradation of hydrocarbons than oyster mushroom mycelium or compost alone.

Oyster mushrooms producing on oil contaminated soil (1–2% = 10,000–20,000 ppm)… Soil toxicity reduced in 16 weeks to less than ~ 200 ppm, allowing for plants, worms and other species to inhabit whereas control piles remained toxic to plants and worms.

A new crop of mushrooms forms several weeks later [after contaminating with oil]. The spores released by these mushrooms have the potential – as an epigenetic response – to pre-select new strains more adaptive to this oil-saturated substrate.

I proposed in 1994 that we have Mycological Response Teams (MRTs) in place to react to catastrophic events, from hurricanes to oil spills. We need to preposition composting and mycoremediation centers adjacent to population centers.

On a grand scale, I envision that we, as a people, develop a common myco-ecology of consciousness and address these common goals through the use of mycelium. To do so means we need to spread awareness and information. Please spread the word of mycelium.

Related: Saving the World with Science and Mushrooms – Fun Fungi – Thinking Slime Moulds

A Breakthrough Cure for Ebola

A Breakthrough Cure for Ebola by Steven Salzberg

Last week, in what may be the biggest medical breakthrough of its kind in years, a group of scientists published results in The Lancet describing a completely new type of anti-viral treatment that appears to cure Ebola. They report a 100% success rate, although admittedly the test group was very small, just 4 rhesus monkeys.

This is a breakthrough not only because it may give us a cure for an incurable, incredibly nasty virus, but also because the same method might work for other viruses, and because we have woefully few effective antiviral treatments. We can treat bacterial infections with antibiotics, but for most viruses, we have either a vaccine or nothing. And a vaccine, wonderful as it is, doesn’t help you after you’re already infected.

The scientists, led by Thomas Geisbert at Boston University, used a relatively new genomics technique called RNA interference to defeat the virus. Here’s how it works.

First, a little background: the Ebola virus is made of RNA, just like the influenza virus. And just like influenza, Ebola has very few genes – only 8. One of its genes, called L protein, is responsible for copying the virus itself. Two others, called VP24 and VP35, interfere with the human immune response, making it difficult for our immune system to defeat the virus.

Geisbert and his colleagues (including scientists from Tekmira Pharmaceuticals and USAMRIID) designed and synthesized RNA sequences that would stick to these 3 genes like glue. How did they do that? We know the Ebola genome’s sequence – it was sequenced way back in 1993. And we know that RNA sticks to itself using the same rules that DNA uses. This knowledge allowed Geisbert and colleagues to design a total of 10 pieces of RNA (called “small interfering RNA” or siRNA) that they knew would stick to the 3 Ebola genes. They also took care to make sure that their sticky RNA would not stick to any human genes, which might be harmful. They packaged these RNAs for delivery by inserting them into nanoparticles that were only 81-85 nanometers across.
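
To make the base-pairing idea concrete, here is a minimal sketch (not the authors' actual design pipeline, and the target sequence shown is invented for illustration) of how an siRNA guide strand can be derived from a stretch of a target gene's mRNA by taking the reverse complement under RNA pairing rules (A-U, G-C):

```python
# Minimal illustration of siRNA design by complementarity.
# NOTE: the target sequence below is a made-up placeholder, not real Ebola sequence,
# and real siRNA design also screens for off-target matches, GC content, etc.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna: str) -> str:
    """Return the reverse complement of an RNA sequence (A-U, G-C pairing)."""
    return "".join(COMPLEMENT[base] for base in reversed(rna))

# Hypothetical 21-nucleotide stretch of a target mRNA (placeholder only)
target_mrna = "AUGGCUAGCUUACGAUCGAUC"

# The siRNA guide strand is the reverse complement, so it can base-pair
# with (stick to) the target and flag it for degradation.
sirna_guide = reverse_complement(target_mrna)

print("target mRNA :", target_mrna)
print("siRNA guide :", sirna_guide)
```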

Related: Science Explained: RNA Interference – Amazing Science: Retroviruses – Ebola Outbreak in Uganda (Dec 2007)

Bee Colonies Continue to Collapse

The activity to find the causes of Colony Collapse Disorder provides a view into the scientific inquiry process of complex living systems. Finding answers is not easy.

Fears for crops as shock figures from America show scale of bee catastrophe

Disturbing evidence that honeybees are in terminal decline has emerged from the United States where, for the fourth year in a row, more than a third of colonies have failed to survive the winter.

The decline of the country’s estimated 2.4 million beehives began in 2006, when a phenomenon dubbed colony collapse disorder (CCD) led to the disappearance of hundreds of thousands of colonies. Since then more than three million colonies in the US and billions of honeybees worldwide have died and scientists are no nearer to knowing what is causing the catastrophic fall in numbers.

It is estimated that a third of everything we eat depends upon honeybee pollination.

Potential causes range from parasites, such as the bloodsucking varroa mite, to viral and bacterial infections, pesticides and poor nutrition stemming from intensive farming methods.

“We believe that some subtle interactions between nutrition, pesticide exposure and other stressors are converging to kill colonies,” said Jeffery Pettis, of the ARS’s bee research laboratory.

“It’s getting worse,” he said. “The AIA survey doesn’t give you the full picture because it is only measuring losses through the winter. In the summer the bees are exposed to lots of pesticides. Farmers mix them together and no one has any idea what the effects might be.” Pettis agreed that losses in some commercial operations are running at 50% or greater.

High Levels of Miticides and Agrochemicals in North American Apiaries: Implications for Honey Bee Health (open access paper on the topic, March 2010)

The 98 pesticides and metabolites detected in mixtures up to 214 ppm in bee pollen alone represents a remarkably high level for toxicants in the brood and adult food of this primary pollinator. This represents over half of the maximum individual pesticide incidences ever reported for apiaries. While exposure to many of these neurotoxicants elicits acute and sublethal reductions in honey bee fitness, the effects of these materials in combinations and their direct association with CCD or declining bee health remains to be determined.

Related: Solving the Mystery of the Vanishing Bees – Virus Found to be One Likely Factor in Bee Colony Collapse Disorder – Bye Bye Bees

Webcast on Finding the Missing Memristor

Very interesting lecture on finding the missing memristor by R. Stanley Williams. From our post in 2008:

How We Found the Missing Memristor By R. Stanley Williams:

For nearly 150 years, the known fundamental passive circuit elements were limited to the capacitor (discovered in 1745), the resistor (1827), and the inductor (1831). Then, in a brilliant but underappreciated 1971 paper, Leon Chua, a professor of electrical engineering at the University of California, Berkeley, predicted the existence of a fourth fundamental device, which he called a memristor.
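
For context (standard circuit theory rather than a quote from Williams' article), each of the four fundamental passive elements ties together two of the basic circuit variables – voltage v, current i, charge q and flux linkage φ – and the memristor supplies the one pairing that was missing:

```latex
% Defining relations of the four fundamental passive elements
% (standard textbook form; the memristor fills the missing q--phi link)
\[
  \mathrm{d}v = R\,\mathrm{d}i \quad\text{(resistor)}, \qquad
  \mathrm{d}q = C\,\mathrm{d}v \quad\text{(capacitor)},
\]
\[
  \mathrm{d}\varphi = L\,\mathrm{d}i \quad\text{(inductor)}, \qquad
  \mathrm{d}\varphi = M(q)\,\mathrm{d}q \quad\text{(memristor)}.
\]
```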

Related: Demystifying the Memristor – posts on computer science – von Neumann Architecture and Bottleneck

Why Wasn’t the Earth Covered in Ice 4 Billion Years Ago – When the Sun was Dimmer

Climate scientists from all over the globe are now able to test their climate models under extreme conditions thanks to Professor Minik Rosing, University of Copenhagen. Rosing has solved one of the great mysteries and paradoxes of our geological past, namely, “Why the earth’s surface was not just one big lump of ice four billion years ago when the Sun’s radiation was much weaker than it is today.” Until now, scientists have presumed that the earth’s atmosphere back then consisted of 30% carbon dioxide (CO2) which ensconced the planet in a protective membrane, thereby trapping heat like a greenhouse.

The faint early sun paradox
In 1972, the late, world famous astronomer Carl Sagan and his colleague George Mullen formulated “The faint early sun paradox.” The paradox is that the earth’s climate has been fairly constant during almost four of the four and a half billion years that the planet has been in existence, despite the fact that radiation from the sun has increased by 25-30 percent.

The paradoxical question that arose for scientists in this connection was why the earth’s surface at its fragile beginning was not covered by ice, seeing that the sun’s rays were much fainter than they are today. Science found one probable answer in 1993, which was proffered by the American atmospheric scientist, Jim Kasting. He performed theoretical calculations that showed that 30% of the earth’s atmosphere four billion years ago consisted of CO2. This in turn entailed that the large amount of greenhouse gases layered themselves as a protective greenhouse around the planet, thereby preventing the oceans from freezing over.

Mystery solved
Now, however, Professor Minik Rosing, from the Natural History Museum of Denmark, and Christian Bjerrum, from the Department of Geography and Geology at University of Copenhagen, together with American colleagues from Stanford University in California have discovered the reason for “the missing ice age” back then, thereby solving the sun paradox, which has haunted scientific circles for more than forty years.

Professor Minik Rosing explains, “What prevented an ice age back then was not high CO2 concentration in the atmosphere, but the fact that the cloud layer was much thinner than it is today. In addition to this, the earth’s surface was covered by water. This meant that the sun’s rays could warm the oceans unobstructed, which in turn could layer the heat, thereby preventing the earth’s watery surface from freezing into ice. The reason for the lack of clouds back in earth’s childhood can be explained by the process by which clouds form. This process requires chemical substances that are produced by algae and plants, which did not exist at the time. These chemical processes would have been able to form a dense layer of clouds, which in turn would have reflected the sun’s rays, throwing them back into the cosmos and thereby preventing the warming of earth’s oceans. Scientists have formerly used the relationship between the radiation from the sun and earth’s surface temperature to calculate that earth ought to have been in a deep freeze during three billion of its four and a half billion years of existence. Sagan and Mullen brought attention to the paradox between these theoretical calculations and geological reality by the fact that the oceans had not frozen. This paradox of having a faint sun and ice-free oceans has now been solved.”
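
A back-of-the-envelope way to see how a smaller cloud albedo can make up for a fainter sun (a standard zero-dimensional energy-balance estimate, not a calculation from Rosing's paper) is the planetary equilibrium temperature:

```latex
% Zero-dimensional energy balance (illustrative only)
% S = incoming solar flux, A = planetary albedo, \sigma = Stefan-Boltzmann constant
\[
  T_{\mathrm{eq}} = \left(\frac{S\,(1 - A)}{4\,\sigma}\right)^{1/4}
\]
% A sun 25--30% fainter (smaller S) can be offset by a sufficiently lower albedo A
% (fewer reflective clouds), keeping the oceans above freezing without a thick CO2 greenhouse.
```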

CO2 history illuminated
Minik Rosing and his team solved the “paradox” by analyzing samples of 3.8-billion-year-old mountain rock from the world’s oldest bedrock, Isua, in western Greenland.

But more importantly, the analyses also provided a finding for a highly important issue in today’s climate research – and climate debate, not least: whether the atmosphere’s CO2 concentration throughout earth’s history has fluctuated strongly or been fairly stable over the course of billions of years.

“The analyses of the CO2-content in the atmosphere, which can be deduced from the age-old Isua rock, show that the atmosphere at the time contained a maximum of one part per thousand of this greenhouse gas. This was three to four times more than the atmosphere’s CO2-content today. However, it is not anywhere in the range of the 30 percent share in early earth history, which has hitherto been the theoretical calculation. Hence we may conclude that the atmosphere’s CO2-content has not changed substantially through the billions of years of earth’s geological history. However, today the graph is turning upward, not least due to the emissions from fossil fuels used by humans. Therefore it is vital to determine the geological and atmospheric premises for the prehistoric past in order to understand the present, not to mention the future, in what pertains to the design of climate models and calculations,” underscores Minik Rosing.

Full press release from the University of Copenhagen in Denmark.

Related: Sun Missing Its Spots – Solar Storms – Why is it Colder at Higher Elevations? – Magnetic Portals Connect Sun and Earth

Gravity Emerges from Quantum Information, Say Physicists

Gravity Emerges from Quantum Information, Say Physicists

One of the hottest new ideas in physics is that gravity is an emergent phenomenon; that it somehow arises from the complex interaction of simpler things.

perhaps the most powerful idea to emerge from Verlinde’s approach is that gravity is essentially a phenomenon of information.

Over recent years many results in quantum mechanics have pointed to the increasingly important role that information appears to play in the Universe. Some physicists are convinced that the properties of information do not come from the behaviour of information carriers such as photons and electrons but the other way round. They think that information itself is the ghostly bedrock on which our universe is built.

Gravity has always been a fly in this ointment. But the growing realisation that information plays a fundamental role here too could open the way to the kind of unification between quantum mechanics and relativity that physicists have dreamed of.
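
As one concrete example of treating gravity as an information effect, here is a compressed sketch of the entropic-force argument Verlinde popularized (it is not the derivation in the linked paper, which works from quantum entanglement, but it shows the flavor of the idea): a mass m approaching a holographic screen changes the screen's entropy, and the resulting entropic force reproduces Newtonian gravity.

```latex
% Verlinde-style entropic sketch (illustrative, not the linked paper's derivation)
% Entropy change as a mass m moves a distance \Delta x toward the screen:
\[
  \Delta S = 2\pi k_{B}\,\frac{m c}{\hbar}\,\Delta x
\]
% A spherical screen of radius r stores N = A c^{3}/(G\hbar) bits, with A = 4\pi r^{2},
% and holds the enclosed energy E = M c^{2} = \tfrac{1}{2} N k_{B} T (equipartition).
% The entropic force F\,\Delta x = T\,\Delta S then reduces to Newton's law:
\[
  F = T\,\frac{\Delta S}{\Delta x} = \frac{G M m}{r^{2}}
\]
```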

This speculative physics is fascinating. Open access paper: Gravity from Quantum Information.

Related: Does Time Exist – Quantum Mechanics Made Relatively Simple Podcasts – Laws of Physics May Need a Revision – Open Science: Explaining Spontaneous Knotting

Statistical Errors in Medical Studies

I have written before about statistics and the various traps people often fall into when examining data (Statistics Insights for Scientists and Engineers, Data Can’t Lie – But People Can be Fooled, Correlation is Not Causation, Simpson’s Paradox). I have also posted about systemic reasons for medical studies presenting misleading results (Why Most Published Research Findings Are False, How to Deal with False Research Findings, Medical Study Integrity (or Lack Thereof), Surprising New Diabetes Data). This post collects some discussion on the topic from several blogs and studies.

HIV Vaccines, p values, and Proof by David Rind

if vaccine were no better than placebo we would expect to see a difference as large or larger than the one seen in this trial only 4 in 100 times. This is distinctly different from saying that there is a 96% chance that this result is correct, which is how many people wrongly interpret such a p value.

So, the modestly positive result found in the trial must be weighed against our prior belief that such a vaccine would fail. Had the vaccine been dramatically protective, giving us much stronger evidence of efficacy, our prior doubts would be more likely to give way in the face of high quality evidence of benefit.
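
A small numerical sketch (the numbers are assumptions chosen only to illustrate the point, not figures from the trial) shows why a p value of 0.04 is not a 96% chance the vaccine works: the posterior probability depends heavily on the prior probability that the vaccine is effective.

```python
# Illustrative only: how prior belief changes what a "significant" result means.
# None of these numbers come from the HIV vaccine trial itself.

def posterior_prob_effective(prior: float, power: float, alpha: float) -> float:
    """P(vaccine truly works | a significant result was observed), by Bayes' rule."""
    true_positive = power * prior           # works AND trial detects it
    false_positive = alpha * (1.0 - prior)  # doesn't work AND trial is "significant" anyway
    return true_positive / (true_positive + false_positive)

alpha = 0.04   # roughly the reported p value, treated here as a false-positive rate
power = 0.80   # assumed chance the trial detects a real effect

for prior in (0.5, 0.1, 0.01):  # optimistic, skeptical, very skeptical priors
    post = posterior_prob_effective(prior, power, alpha)
    print(f"prior belief {prior:>4.0%} -> posterior after significant result: {post:.0%}")
```

With a skeptical prior, a nominally significant result still leaves a substantial chance that the vaccine does nothing, which is the point Rind is making.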

While the actual analysis the investigators decided to make primary would be completely appropriate had it been specified up front, it now suffers under the concern of showing marginal significance after three bites at the statistical apple; these three bites have to adversely affect our belief in the importance of that p value. And, it’s not so obvious why they would have reported this result rather than excluding those 7 patients from the per protocol analysis and making that the primary analysis; there might have been yet a fourth analysis that could have been reported had it shown that all-important p value below 0.05.
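
The “three bites at the apple” problem can be made concrete with a quick simulation (purely illustrative, not modeled on the trial’s actual analyses): even when there is no real effect, the chance that at least one of three analyses crosses p < 0.05 is noticeably higher than 5%.

```python
# Illustrative simulation of the "multiple bites at the apple" problem.
# Assumes the three analyses of the same null (no-effect) data are roughly independent,
# which overstates the inflation a little; correlated analyses inflate it less.
import random

random.seed(0)
n_trials = 100_000
n_analyses = 3
alpha = 0.05

hits = 0
for _ in range(n_trials):
    # Under the null hypothesis, each analysis's p value is uniform on [0, 1].
    p_values = [random.random() for _ in range(n_analyses)]
    if min(p_values) < alpha:
        hits += 1

print(f"Chance at least one of {n_analyses} analyses is 'significant': {hits / n_trials:.1%}")
# Expected roughly 1 - 0.95**3, i.e. about 14% rather than 5%.
```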

How to Avoid Commonly Encountered Limitations of Published Clinical Trials by Sanjay Kaul, MD and George A. Diamond, MD

Trials often employ composite end points that, although they enable assessment of nonfatal events and improve trial efficiency and statistical precision, entail a number of shortcomings that can potentially undermine the scientific validity of the conclusions drawn from these trials. Finally, clinical trials often employ extensive subgroup analysis. However, lack of attention to proper methods can lead to chance findings that might misinform research and result in suboptimal practice.

Why Most Published Research Findings Are False by John P. A. Ioannidis
Continue reading