Category Archives: Research

Evidence that Refined Carbohydrates Threaten the Heart

More Evidence that Refined Carbohydrates, not Fats, Threaten the Heart

Eat less saturated fat: that has been the take-home message from the U.S. government for the past 30 years. But while Americans have dutifully reduced the percentage of daily calories from saturated fat since 1970, the obesity rate during that time has more than doubled, diabetes has tripled, and heart disease is still the country’s biggest killer. Now a spate of new research, including a meta-analysis of nearly two dozen studies, suggests a reason why: investigators may have picked the wrong culprit. Processed carbohydrates, which many Americans eat today in place of fat, may increase the risk of obesity, diabetes and heart disease more than fat does – a finding that has serious implications for new dietary guidelines expected this year.

Right now, Post explains, the agency’s main message to Americans is to limit overall calorie intake, irrespective of the source. “We’re finding that messages to consumers need to be short and simple and to the point,” he says. Another issue facing regulatory agencies, notes Harvard’s Stampfer, is that “the sugared beverage industry is lobbying very hard and trying to cast doubt on all these studies.”

The medical studies about which foods to eat to remain healthy can be confusing, but some details are not really in doubt. While the exact dangers of processed carbohydrates, fat, excess calories and high fructose corn syrup may be in question, there is no doubt that we, in the USA, are not as healthy as we should be. And food is a significant part of the problem. Eat food, not too much, mostly plants, and get enough exercise is good advice.

Related: Statistical Errors in Medical Studies | Researchers Find High-Fructose Corn Syrup Results in More Weight Gain | The Calorie Delusion | Obesity Epidemic Explained, Kind Of | Active Amish Avoid Obesity

A single Liter of Seawater Can Hold More Than One Billion Microorganisms

Mat of microbes the size of Greece discovered on seafloor

mighty microbes, which constitute 50 to 90 percent of the oceans’ total biomass, according to newly released data.

These tiny creatures can join together to create some of the largest masses of life on the planet, and researchers working on the decade-long Census of Marine Life project found one such seafloor mat off the Pacific coast of South America that is roughly the size of Greece.

A single liter of seawater, once thought to contain about 100,000 microbes, can actually hold more than one billion microorganisms, the census scientists reported. But these small creatures don’t just live in the water column or on the seafloor. Large communities of microscopic animals have even been discovered more than one thousand meters beneath the seafloor. Some of these deep burrowers, such as loriciferans, are only a quarter of a millimeter long.

“Far from being a lifeless desert, the deep sea rivals such highly diverse ecosystems as tropical rainforests and coral reefs,”

Microbes help to turn atmospheric carbon dioxide into usable carbon, completing about 95 percent of all respiration in the Earth’s oceans…

Related: Iron-breathing Species Isolated in Antarctic for Millions of Years | Life Far Beneath the Ocean | Life Untouched by the Sun

Non-infectious Prion Protein Linked to Alzheimer’s Disease

‘Harmless’ prion protein linked to Alzheimer’s disease

Non-infectious prion proteins found in the brain may contribute to Alzheimer’s disease, researchers have found.

normal prion proteins produced naturally in the brain interact with the amyloid-β peptides that are hallmarks of Alzheimer’s disease. Blocking this interaction in preparations made from mouse brains halted some neurological defects caused by the accumulation of amyloid-β peptide. It was previously thought that only infectious prion proteins, rather than their normal, non-infectious counterparts, played a role in brain degeneration.

Alzheimer’s disease has long been linked to the build-up of amyloid-β peptides, first into relatively short chains known as oligomers, and then eventually into the long, sticky fibrils that form plaques in the brain. The oligomeric form of the peptide is thought to be toxic, but exactly how it acts in the brain is unknown.

Related: Soil Mineral Degrades the Nearly Indestructible Prion | Prion Proteins, Without Genes, Can Evolve | Clues to Prion Infectivity

Critter Cam: Sea Lion versus Octopus

Octopus vs. Sea Lion – First Ever Video

Sea lions fitted with GPS trackers and a National Geographic Crittercam are taking scientists on amazing journeys to previously unknown marine ‘hot spots.’ These areas are important not only for providing the sea lions’ food, but also for maintaining fish populations.

The Crittercams were deployed at Dangerous Reef in Spencer Gulf, a rocky island the size of a football field, and home to the biggest Australian sea lion colony.

Dr. Page says, “One important discovery is that the sea lions always feed on the sea floor” and they don’t eat open ocean fish, known as pelagic. “This is critical information because the marine parks are being set up to protect sea floor habitats,” a move that the scientists can now confirm will protect critical sea lion resources.

In one of the more spectacular pieces of Crittercam video so far, we can see this female working hard to handle a challenging prey item – a large octopus. Too big to swallow in one gulp, she drags it to the surface where she can breathe while she works at breaking it down into bite-size pieces.

Related: Orcas Create Wave to Push Seal Off Ice | Octopus Juggling Fellow Aquarium Occupants | Water Buffaloes, Lions and Crocodiles Oh My | Cat and Crow Friends

HP Makes Progress on Revolutionary Memristors

H.P. Sees a Revolution in Memory Chip

Memristor-based systems also hold out the prospect of fashioning analog computing systems that function more like biological brains, Dr. Chua said.

“Our brains are made of memristors,” he said, referring to the function of biological synapses. “We have the right stuff now to build real brains.”

In an interview at the H.P. research lab, Stan Williams, a company physicist, said that in the two years since announcing working devices, his team had increased their switching speed to match today’s conventional silicon transistors. The researchers had tested them in the laboratory, he added, proving they could reliably make hundreds of thousands of reads and writes.

That is a significant hurdle to overcome, indicating that it is now possible to consider memristor-based chips as an alternative to today’s transistor-based flash computer memories, which are widely used in consumer devices like MP3 players, portable computers and digital cameras.

“Not only do we think that in three years we can be better than the competitors,” Dr. Williams said. “The memristor technology really has the capacity to continue scaling for a very long time, and that’s really a big deal.”

Related: Demystifying the Memristor | How We Found the Missing Memristor | Self-assembling Nanotechnology in Chip Manufacturing

Why Wasn’t the Earth Covered in Ice 4 Billion Years Ago – When the Sun was Dimmer

Climate scientists from all over the globe are now able to test their climate models under extreme conditions thanks to Professor Minik Rosing, University of Copenhagen. Rosing has solved one of the great mysteries and paradoxes of our geological past, namely, “Why the earth’s surface was not just one big lump of ice four billion years ago when the Sun’s radiation was much weaker than it is today.” Until now, scientists have presumed that the earth’s atmosphere back then consisted of 30% carbon dioxide (CO2) which ensconced the planet in a protective membrane, thereby trapping heat like a greenhouse.

The faint early sun paradox
In 1972, the late, world famous astronomer Carl Sagan and his colleague George Mullen formulated "the faint early sun paradox." The paradox is that the earth's climate has been fairly constant during almost four of the four and a half billion years the planet has been in existence, despite the fact that radiation from the sun has increased by 25-30 percent.

The paradoxical question that arose for scientists was why the earth's surface at its fragile beginning was not covered by ice, given that the sun's rays were much fainter than they are today. Science found one probable answer in 1993, proffered by the American atmospheric scientist Jim Kasting. He performed theoretical calculations showing that 30% of the earth's atmosphere four billion years ago consisted of CO2. This large concentration of greenhouse gas would have wrapped the planet in a protective greenhouse layer, preventing the oceans from freezing over.

Mystery solved
Now, however, Professor Minik Rosing, from the Natural History Museum of Denmark, and Christian Bjerrum, from the Department of Geography and Geology at University of Copenhagen, together with American colleagues from Stanford University in California have discovered the reason for “the missing ice age” back then, thereby solving the sun paradox, which has haunted scientific circles for more than forty years.

Professor Minik Rosing explains, “What prevented an ice age back then was not high CO2 concentration in the atmosphere, but the fact that the cloud layer was much thinner than it is today. In addition to this, the earth’s surface was covered by water. This meant that the sun’s rays could warm the oceans unobstructed, which in turn could layer the heat, thereby preventing the earth’s watery surface from freezing into ice. The reason for the lack of clouds back in earth’s childhood can be explained by the process by which clouds form. This process requires chemical substances that are produced by algae and plants, which did not exist at the time. These chemical processes would have been able to form a dense layer of clouds, which in turn would have reflected the sun’s rays, throwing them back into the cosmos and thereby preventing the warming of earth’s oceans. Scientists have formerly used the relationship between the radiation from the sun and earth’s surface temperature to calculate that earth ought to have been in a deep freeze during three billion of its four and a half billion years of existence. Sagan and Mullen brought attention to the paradox between these theoretical calculations and geological reality by the fact that the oceans had not frozen. This paradox of having a faint sun and ice-free oceans has now been solved.”

CO2 history illuminated
By analyzing samples of 3.8-billion-year-old rock from Isua in western Greenland, the world's oldest bedrock, Minik Rosing and his team solved the paradox.

But more importantly, the analyses also provided a finding for a highly important issue in today’s climate research – and climate debate, not least: whether the atmosphere’s CO2 concentration throughout earth’s history has fluctuated strongly or been fairly stable over the course of billions of years.

“The analyses of the CO2-content in the atmosphere, which can be deduced from the age-old Isua rock, show that the atmosphere at the time contained a maximum of one part per thousand of this greenhouse gas. This was three to four times more than the atmosphere’s CO2-content today. However, it is nowhere in the range of the 30 percent share in early earth history, which has hitherto been the theoretical calculation. Hence we may conclude that the atmosphere’s CO2-content has not changed substantially through the billions of years of earth’s geological history. However, today the graph is turning upward, not least due to the emissions from fossil fuels used by humans. Therefore it is vital to determine the geological and atmospheric premises for the prehistoric past in order to understand the present, not to mention the future, in what pertains to the design of climate models and calculations,” underscores Minik Rosing.
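A quick back-of-envelope check puts these figures side by side. The baseline values here are assumptions for illustration: the press release's "three to four times" comparison works out if the baseline is near the pre-industrial ~280 ppm rather than the ~390 ppm of around 2010.

```python
# Back-of-envelope check of the Isua CO2 figures.
# Baseline values are assumptions for illustration.
isua_max_ppm = 1_000      # "one part per thousand" = 1,000 ppm
baseline_ppm = 280        # assumed pre-industrial level; ~390 ppm around 2010
old_theory_ppm = 300_000  # the earlier 30% hypothesis, expressed in ppm

print(isua_max_ppm / baseline_ppm)    # ~3.6, i.e. "three to four times"
print(old_theory_ppm / isua_max_ppm)  # 300: the old estimate was ~300x the measured ceiling
```

In other words, the Isua measurement does not just shave the old 30% estimate; it cuts it by a factor of roughly 300.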

Full press release from the University of Copenhagen in Denmark.

Related: Sun Missing Its Spots | Solar Storms | Why is it Colder at Higher Elevations? | Magnetic Portals Connect Sun and Earth

Next steps for Google’s Experimental Fiber Network

Think big with a gig: Google’s experimental fiber

Universal, ultra high-speed Internet access will make all this and more possible. We’ve urged the FCC to look at new and creative ways to get there in its National Broadband Plan – and today we’re announcing an experiment of our own.

We’re planning to build and test ultra high-speed broadband networks in a small number of trial locations across the United States. We’ll deliver Internet speeds more than 100 times faster than what most Americans have access to today with 1 gigabit per second, fiber-to-the-home connections. We plan to offer service at a competitive price to at least 50,000 and potentially up to 500,000 people.

Next steps for our experimental fiber network

So what’s next? Over the coming months, we’ll be reviewing the responses to determine where to build. As we narrow down our choices, we’ll be conducting site visits, meeting with local officials and consulting with third-party organizations. Based on a rigorous review of the data, we will announce our target community or communities by the end of the year.

Of course, we’re not going to be able to build in every interested community — our plan is to reach a total of at least 50,000 and potentially up to 500,000 people with this experiment. Wherever we decide to build, we hope to learn lessons that will help improve Internet access everywhere.

This is another great idea from Google. Not only does it push forward internet connectivity in the USA, which lags well behind other countries, but it will hopefully lead to some real engineering breakthroughs. And it is a smart move to increase Google's potential income: a better internet experience for users will likely help Google quite a bit.

Related: Google’s Underwater Cables | Google Server Hardware Design | China’s Next Generation Internet | Net Neutrality: This is serious

Taste Cells in the Stomach and Intestine

Stomach’s Sweet Tooth

Taste, scientists are discovering, is a whole-body sensation. There are taste cells in the stomach, intestine and, evidence suggests, the pancreas, colon and esophagus. These sensory cells are part of an ancient battalion tasked with guiding food choices

Newly discovered taste cells in the gut appear to send a “prepare for fuel” message to the body, a finding that may explain a link between diet soda and diabetes risk.

The gut’s taste cells appear to be built from the same machinery as the taste cells of the tongue, the structures of which scientists have only recently nailed down. Taste cells interact with what are called “tastants” via receptors, specialized proteins that protrude from cell walls and bind to specific molecules drifting by. When a tastant binds to a receptor, it signals other molecules that, in the mouth, immediately send an “accept” or “reject” message to the brain.

Gut taste cells appear to regulate, in part, secretion of insulin, a hormone crucial for telling body tissues whether they should tap newly arrived glucose or valuable stored fat for energy.

Related: Waste from Gut Bacteria Helps Host Control Weight | Surprising New Diabetes Data | Reducing Risk of Diabetes Through Exercise | Drinking Soda and Obesity

Engineering Mosquitoes to be Flying Vaccinators

Mosquitoes Engineered Into Flying Vaccinators by Emily Singer

Researchers in Japan have transformed mosquitoes into vaccine-carrying syringes by genetically engineering the insects to express the vaccine for leishmaniasis–a parasitic disease transmitted by the sandfly–in their saliva. According to a study in Insect Molecular Biology, mice bitten by these mosquitoes produced antibodies against the parasite. It’s not yet clear whether the immune response was strong enough to protect against infection.

“Following bites, protective immune responses are induced, just like a conventional vaccination but with no pain and no cost,” said lead researcher Shigeto Yoshida, from Jichi Medical University in Japan, in a press release from the journal. “What’s more, continuous exposure to bites will maintain high levels of protective immunity, through natural boosting, for a lifetime. So the insect shifts from being a pest to being beneficial.”

Researchers consider the project more of a proof of principle experiment than a viable public health option, at least for now.

Very cool.

Related: New and Old Ways to Make Flu Vaccines | Treated Mosquito Nets Prevent Malaria | Re-engineering Mosquitoes so They Cannot Carry Disease

Statistical Errors in Medical Studies

I have written before about statistics and the various traps people often fall into when examining data (Statistics Insights for Scientists and Engineers, Data Can’t Lie – But People Can be Fooled, Correlation is Not Causation, Simpson’s Paradox). I have also posted about systemic reasons why medical studies present misleading results (Why Most Published Research Findings Are False, How to Deal with False Research Findings, Medical Study Integrity (or Lack Thereof), Surprising New Diabetes Data). This post collects some discussion on the topic from several blogs and studies.

HIV Vaccines, p values, and Proof by David Rind

if vaccine were no better than placebo we would expect to see a difference as large or larger than the one seen in this trial only 4 in 100 times. This is distinctly different from saying that there is a 96% chance that this result is correct, which is how many people wrongly interpret such a p value.

So, the modestly positive result found in the trial must be weighed against our prior belief that such a vaccine would fail. Had the vaccine been dramatically protective, giving us much stronger evidence of efficacy, our prior doubts would be more likely to give way in the face of high quality evidence of benefit.
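The point about prior belief can be made concrete with Bayes' rule. All the numbers below are illustrative assumptions (a 10% prior that such a vaccine works, 80% power), not figures from the trial; the sketch shows why a p value of 0.04 is not a 96% probability that the result is correct.

```python
# A minimal Bayesian sketch of why p = 0.04 does not mean
# "96% chance the vaccine works". All inputs are assumed for illustration.
prior_works = 0.10   # prior belief that such a vaccine is effective
power = 0.80         # P(seeing a result like this | vaccine works)
p_value = 0.04       # P(a result at least this extreme | vaccine is useless)

# Bayes' rule: P(works | result) =
#   P(result | works) * P(works) / P(result)
posterior = (prior_works * power) / (
    prior_works * power + (1 - prior_works) * p_value
)
print(round(posterior, 2))  # 0.69 - well below the naive 0.96
```

With a more skeptical prior the posterior drops further, which is exactly why a modestly positive trial against a backdrop of failed vaccines warrants only modest belief.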

While the actual analysis the investigators decided to make primary would be completely appropriate had it been specified up front, it now suffers under the concern of showing marginal significance after three bites at the statistical apple; these three bites have to adversely affect our belief in the importance of that p value. And, it’s not so obvious why they would have reported this result rather than excluding those 7 patients from the per protocol analysis and making that the primary analysis; there might have been yet a fourth analysis that could have been reported had it shown that all important p value below 0.05.
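The "three bites at the statistical apple" concern is easy to quantify. Assuming, for simplicity, three independent analyses of a truly ineffective vaccine, the chance that at least one crosses p < 0.05 is well above 5%:

```python
# False-positive inflation from multiple analyses ("three bites at the
# statistical apple"), assuming independent tests for simplicity.
alpha = 0.05
n_analyses = 3

p_at_least_one_significant = 1 - (1 - alpha) ** n_analyses
print(round(p_at_least_one_significant, 3))  # 0.143, nearly triple alpha
```

Real trial analyses are correlated rather than independent, so the true inflation is smaller, but the direction of the effect is the same: each additional look at the data erodes the meaning of a marginal p value.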

How to Avoid Commonly Encountered Limitations of Published Clinical Trials by Sanjay Kaul, MD and George A. Diamond, MD

Trials often employ composite end points that, although they enable assessment of nonfatal events and improve trial efficiency and statistical precision, entail a number of shortcomings that can potentially undermine the scientific validity of the conclusions drawn from these trials. Finally, clinical trials often employ extensive subgroup analysis. However, lack of attention to proper methods can lead to chance findings that might misinform research and result in suboptimal practice.

Why Most Published Research Findings Are False by John P. A. Ioannidis
Continue reading