Tag Archives: Research

Data Analysts Captivated by R’s Power

Data mining has entered a golden age, whether being used to set ad prices, find new drugs more quickly or fine-tune financial models. Companies as diverse as Google, Pfizer, Merck, Bank of America, the InterContinental Hotels Group and Shell use it.

Close to 1,600 different packages reside on just one of the many Web sites devoted to R, and the number of packages has grown exponentially. One package, called BiodiversityR, offers a graphical interface aimed at making calculations of environmental trends easier.

Another package, called Emu, analyzes speech patterns, while GenABEL is used to study the human genome. The financial services community has demonstrated a particular affinity for R; dozens of packages exist for derivatives analysis alone. “The great beauty of R is that you can modify it to do all sorts of things,” said Hal Varian, chief economist at Google. “And you have a lot of prepackaged stuff that’s already available, so you’re standing on the shoulders of giants.”
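As a rough illustration of what that "prepackaged stuff" looks like in practice, here is a minimal sketch in Python (rather than R, and with made-up numbers) of the kind of one-line statistical test such packages provide; the R equivalent would be a single call to t.test().

```python
import numpy as np
from scipy import stats

# Made-up yields from two versions of an experiment, purely for illustration.
control = np.array([14.2, 15.1, 13.8, 14.9, 15.4, 14.4])
treated = np.array([15.9, 16.3, 15.2, 16.8, 15.7, 16.1])

# One line of "prepackaged" statistics: a two-sample t-test someone else already wrote.
t_stat, p_value = stats.ttest_ind(treated, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```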

R first appeared in 1996, when the statistics professors Ross Ihaka and Robert Gentleman of the University of Auckland in New Zealand released the code as a free software package. According to them, the notion of devising something like R sprang up during a hallway conversation. They both wanted technology better suited for their statistics students, who needed to analyze data and produce graphical models of the information. Most comparable software had been designed by computer scientists and proved hard to use.

R is another example of great, free, open source software. See R packages for Statistics for Experimenters.

via: R in the news

Related: Mistakes in Experimental Design and Interpretation – Data Based Decision Making at Google – Freeware Math Programs – How Large Quantities of Information Change Everything

Does the Earth Have Two Cores?

Did Earth’s Twin Cores Spark Plate Tectonics?

A new theory aims to rewrite it all by proposing the seemingly impossible: Earth has not one but two inner cores.

The idea stems from an ancient, cataclysmic collision that scientists believe occurred when a Mars-sized object hit Earth about 4.45 billion years ago. The young Earth was still so hot that it was mostly molten, and debris flung from the impact is thought to have formed the moon.

Haluk Cetin and Fugen Ozkirim of Murray State University think the core of the Mars-sized object may have been left behind inside Earth, and that it sank down near the original inner core. There the two may still remain, either separate or as conjoined twins, locked in a tight orbit.

Their case is largely circumstantial and speculative, Cetin admitted. “We have no solid evidence yet, and we’re not saying 100 percent that it still exists,” he said. “The interior of Earth is a very hard place to study.”

The ancient collision is a widely accepted phenomenon. But most scientists believe the incredible pressure at the center of the planet would’ve long since pushed the two cores into each other.

I must say two cores seems very far-fetched to me. But it is another great example of the scientific discovery process and an interesting idea.

Related: Himalayas Geology – Drilling to the Center of the Earth – Curious Cat Science and Engineering Search

Bacteria Offer Line of Attack on Cystic Fibrosis

MIT researchers have found that the pigments responsible for the blue-green stain of the mucus that clogs the lungs of cystic fibrosis (CF) patients are primarily signaling molecules that allow large clusters of the opportunistic infection agent, Pseudomonas aeruginosa, to organize themselves into structured communities.

P. aeruginosa appears as a classic opportunistic infection, easily shrugged off by healthy people but a grave threat to those with CF, which chokes the lungs of its victims with sticky mucus.

“We have a long way to go before being able to test this idea, but the hope is that if survival in the lung is influenced by phenazine — or some other electron-shuttling molecule or molecules — tampering with phenazine trafficking might be a potential way to make antibiotics more effective,” said Newman, whose lab investigates how ancestral bacteria on the early Earth evolved the ability to metabolize minerals.

Related: Clues to Prion Infectivity – River Blindness Worm Develops Resistance to Drugs – Beneficial Bacteria

HHMI on Science 2.0: Information Revolution

The Howard Hughes Medical Institute does great things for science and for open science. They have an excellent article in their HHMI Bulletin – Science 2.0: You Say You Want a Revolution?

Cross-pollination among research disciplines is in fact at the core of many other popular science blogs. Michael Eisen, an HHMI investigator at the University of California, Berkeley, is an avid blog reader who particularly enjoys John Hawks’ site on paleoanthropology, genetics, and evolution. A recent post there discussed a new sequencing of Neanderthal mitochondrial DNA. “It’s like a conduit into another whole world,” says Eisen.

The current extreme of collaboration via Science 2.0 is OpenWetWare.org. Begun in 2003 by Austin Che, who was then a computer science and biology graduate student at MIT, this biological-engineering Website uses the wiki model to showcase protocols and lab books: everything is open and can be edited by any of its 4,000 members.

“Most publishers wish open access would go away,” says Brown. It won’t. Major research-funding organizations, including NIH, HHMI, and the Wellcome Trust, now require their grantees to post their findings on open-access Websites such as PLoS or PubMed Central within 12 months of publication in traditional journals. Publishers are pushing back, however, and in September, the House Judiciary Committee began holding hearings on whether the federal government should be allowed to require grantees to submit accepted papers to a free archive.

Related: $600 Million for Basic Biomedical Research from HHMI – Tracking the Ecosystem Within Us – Publishers Continue to Fight Open Access to Science – $1 Million Each for 20 Science Educators

Cancer Detection and Death Rates Decline

Declines in Cancer Incidence and Death Rates, from a report by the National Cancer Institute and CDC:

“The drop in incidence seen in this year’s Annual Report is something we’ve been waiting to see for a long time,” said Otis W. Brawley, M.D., chief medical officer of the American Cancer Society (ACS). “However, we have to be somewhat cautious about how we interpret it, because changes in incidence can be caused not only by reductions in risk factors for cancer, but also by changes in screening practices. Regardless, the continuing drop in mortality is evidence once again of real progress made against cancer, reflecting real gains in prevention, early detection, and treatment.”

According to a U.S. Surgeon General’s report, cigarette smoking accounts for approximately 30 percent of all cancer deaths, with lung cancer accounting for 80 percent of the smoking-attributable cancer deaths. Other cancers caused by smoking include cancers of the oral cavity, pharynx, larynx, esophagus, stomach, bladder, pancreas, liver, kidney, and uterine cervix and myeloid leukemia.
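Taken together, those two quoted percentages imply that smoking-attributable lung cancer alone accounts for roughly a quarter of all cancer deaths; a quick back-of-the-envelope check:

```python
# Rough arithmetic from the Surgeon General figures quoted above.
smoking_share_of_cancer_deaths = 0.30  # smoking causes ~30% of all cancer deaths
lung_share_of_smoking_deaths = 0.80    # lung cancer is ~80% of smoking-attributable cancer deaths

lung_cancer_from_smoking = smoking_share_of_cancer_deaths * lung_share_of_smoking_deaths
print(f"{lung_cancer_from_smoking:.0%} of all cancer deaths")  # about 24%
```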

Diagnoses Of Cancer Decline

The analysis found that the overall incidence of cancer began inching down in 1999, but not until the data for 2005 were analyzed was it clear that a long-term decline was underway. “The take-home message is that many of the things we’ve been telling people to do to be healthy have finally reached the point where we can say that they are working,” Brawley said. “These things are really starting to pay off.”

Brawley and others cautioned, however, that part of the reduction could be the result of fewer people getting screened for prostate and breast cancers. In addition, the rates at which many other types of cancer are being diagnosed are still increasing.

Some experts said the drop was not surprising, noting that it was primarily the result of a fall in lung cancer because of declines in smoking that occurred decades ago. They criticized the ongoing focus on detecting and treating cancer and called for more focus on prevention.

“The whole cancer establishment has been focused on treatment, which has not been terribly productive,” said John C. Bailar III, who studies cancer trends at the National Academy of Sciences. “I think what people should conclude from this is we ought to be putting most of our resources where we know there has been progress, almost in spite of what we’ve done, and stop this single-minded focus on treatment.”

Related: Is there a Declining Trend in Cancer Deaths? – Cancer Deaths Increasing, Death Rate Decreasing – Leading Causes of Death – posts discussing cancer – Nanoparticles to Battle Cancer

Rat Brain Cells, in a Dish, Flying a Plane

Adaptive Flight Control With Living Neuronal Networks on Microelectrode Arrays (open access paper) by Thomas B. DeMarse and Karl P. Dockendorf, Department of Biomedical Engineering, University of Florida

The researchers investigated the ability of living neurons to act as a set of neuronal weights used to control the flight of a simulated aircraft. These weights were manipulated via high frequency stimulation inputs to produce a system in which a living neuronal network would “learn” to control an aircraft for straight and level flight.

A system was created in which a network of living rat cortical neurons were slowly adapted to control an aircraft’s flight trajectory. This was accomplished by using high frequency stimulation pulses delivered to two independent channels, one for pitch, and one for roll. This relatively simple system was able to control the pitch and roll of a simulated aircraft.

When Dr. Thomas DeMarse first puts the neurons in the dish, they look like little more than grains of sand sprinkled in water. However, individual neurons soon begin to extend microscopic lines toward each other, making connections that represent neural processes. “You see one extend a process, pull it back, extend it out — and it may do that a couple of times, just sampling who’s next to it, until over time the connectivity starts to establish itself,” he said. “(The brain is) getting its network to the point where it’s a live computation device.”

To control the simulated aircraft, the neurons first receive information from the computer about flight conditions: whether the plane is flying straight and level or is tilted to the left or to the right. The neurons then analyze the data and respond by sending signals to the plane’s controls. Those signals alter the flight path and new information is sent to the neurons, creating a feedback system.

“Initially when we hook up this brain to a flight simulator, it doesn’t know how to control the aircraft,” DeMarse said. “So you hook it up and the aircraft simply drifts randomly. And as the data come in, it slowly modifies the (neural) network so over time, the network gradually learns to fly the aircraft.”
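The passages above describe a closed feedback loop: flight state in, corrective signal out, altered flight state back in, with the network gradually adapting until it holds the plane level. Here is a minimal toy sketch of that loop in Python, using a simple software stand-in for the living network (the real experiment adapted cultured rat neurons on a microelectrode array via high-frequency stimulation; this update rule is only illustrative).

```python
import random

class ToyAdaptiveController:
    """Software stand-in for the dish of neurons: starts knowing nothing, adapts over time."""

    def __init__(self):
        self.gain = 0.0  # no corrective ability at first, like the untrained network

    def respond(self, error):
        # "Analyze the data and respond": output a correction proportional to the current gain.
        return -self.gain * error

    def adapt(self, error):
        # Crude stand-in for the stimulation that reshaped the living network:
        # increase the corrective gain whenever the plane is off straight-and-level flight.
        self.gain += 0.01 * abs(error)

def fly(steps=200):
    controller = ToyAdaptiveController()
    pitch = random.uniform(-1.0, 1.0)  # initial deviation from level flight
    for _ in range(steps):
        error = pitch                                   # flight condition fed back to the "network"
        command = controller.respond(error)             # signal sent to the plane's controls
        pitch += 0.1 * command + random.gauss(0, 0.01)  # controls act, plus a little turbulence
        controller.adapt(error)                         # new information reshapes the controller
    return pitch

print(f"final pitch deviation: {fly():.3f}")  # drifts at first, settles near level over time
```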

Although the brain currently is able to control the pitch and roll of the simulated aircraft in weather conditions ranging from blue skies to stormy, hurricane-force winds, the underlying goal is a more fundamental understanding of how neurons interact as a network, DeMarse said.

Related: Neural & Hybrid Computing Laboratory @ University of Florida – UF Scientist: “Brain” In A Dish Acts As Autopilot, Living Computer – Roachbot: Cockroach Controlled Robot – New Neurons in Old Brains – posts on brain research – Viruses and What is Life – Great Self Portrait of Astronaut Engineer

Demystifying the Memristor

Demystifying the memristor

The memristor, short for memory resistor, could make it possible to develop far more energy-efficient computing systems with memories that retain information even after the power is off, so there’s no wait for the system to boot up after turning the computer on. It may even be possible to create systems with some of the pattern-matching abilities of the human brain.

By providing a mathematical model for the physics of a memristor, the team makes it possible for engineers to develop integrated circuit designs that take advantage of its ability to retain information.
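The actual device physics and model are in the team’s paper; purely as an illustration of what a charge-dependent resistance means, here is a minimal numerical sketch (with made-up parameter values) of an ideal memristor, where the voltage across the device depends on the history of the current that has flowed through it.

```python
import numpy as np

# Ideal memristor relation v(t) = M(q(t)) * i(t): resistance M depends on the total
# charge q that has passed through the device, which is what lets it "remember".
R_on, R_off = 100.0, 16_000.0   # illustrative low/high resistance states (ohms)
q_d = 1e-4                      # illustrative charge needed to switch fully "on" (coulombs)

def memristance(q):
    x = np.clip(q / q_d, 0.0, 1.0)       # normalized internal state
    return R_off - (R_off - R_on) * x    # resistance falls as more charge passes

t = np.linspace(0.0, 2.0, 2000)
i = 1e-4 * np.sin(2 * np.pi * t)         # sinusoidal drive current (amps)
q = np.cumsum(i) * (t[1] - t[0])         # accumulated charge = integral of the current
v = memristance(q) * i                   # voltage depends on the current's history

# Even when i returns to zero (power off), q -- and hence the resistance -- is unchanged,
# which is the "retains information without power" property described above.
```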

“This opens up a whole new door in thinking about how chips could be designed and operated,” Williams says.

Engineers could, for example, develop a new kind of computer memory that would supplement and eventually replace today’s commonly used dynamic random access memory (D-RAM). Computers using conventional D-RAM lack the ability to retain information once they are turned off. When power is restored to a D-RAM-based computer, a slow, energy-consuming “boot-up” process is necessary to retrieve data stored on a magnetic disk required to run the system.

Related: How Computers Boot Up – Nanotechnology Breakthroughs for Computer Chips – Delaying the Flow of Light on a Silicon Chip – Self-assembling Nanotechnology in Chip Manufacturing

New Supercomputer for Science Research

photo of Jaguar Supercomputer

“Jaguar is one of science’s newest and most formidable tools for advancement in science and engineering,” said Dr. Raymond L. Orbach, DOE’s Under Secretary for Science. The new capability will be added to resources available to science and engineering researchers in the USA.

80 percent of the Leadership Computing Facility resources are allocated through the United States Department of Energy’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, a competitively selected, peer reviewed process open to researchers from universities, industry, government and non-profit organizations. Scientists and engineers at DOE’s Oak Ridge National Laboratory are finding an increasing variety of uses for the Cray XT system. A recent report identified 10 breakthroughs in U.S. computational science during the past year. Six of the breakthroughs involved research conducted with the Jaguar supercomputer, including a first-of-its-kind simulation of combustion processes that will be used to design more efficient automobile engines. Read the computational science report. Read full press release.

ORNL’s Jaguar fastest computer for science research

Jaguar will be used for studies of global climate change, as well as development of alternative energy sources and other types of scientific problem-solving that previously could not be attempted.

Zacharia said ORNL’s Jaguar was upgraded by adding 200 Cray XT5 cabinets – loaded with AMD quad-core processors and Cray SeaStar interconnects – to the computer’s existing 84 Cray XT4 cabinets. The combined machine set a new standard for computational science.

The peak operating speed is apparently just below that of Los Alamos National Laboratory’s IBM Roadrunner system, which is designed for 1.7 petaflops. But the Jaguar reportedly has triple the memory of Roadrunner and much broader research potential.

Because the Jaguar has come online sooner than expected, Zacharia said an alert was sent to top U.S. scientists inviting them to apply for early access to the Oak Ridge computer. Their scientific proposals will be reviewed on an accelerated timetable, he said.

The peak capability of 1.64 petaflops is attributed to 1.384 petaflops from the new Cray XT5, combined with 0.266 petaflops from the existing Cray XT4 system, Zacharia said.

How fast is a quadrillion calculations per second? “One way to understand the speed is by analogy,” Zacharia said recently. “It would take the entire population of the Earth (more than 6 billion people), each of us working a handheld calculator at the rate of one second per calculation, more than 460 years to do what Jaguar at a quadrillion can do in one day.”
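The analogy checks out, roughly; here is the back-of-the-envelope arithmetic, assuming exactly one quadrillion calculations per second and 6 billion people:

```python
# Back-of-the-envelope check of the calculator analogy quoted above.
jaguar_per_second = 1e15                       # one quadrillion calculations per second
jaguar_per_day = jaguar_per_second * 86_400    # calculations Jaguar does in one day

people = 6e9                                   # "more than 6 billion people"
human_rate = 1.0                               # one calculation per second per person
seconds_needed = jaguar_per_day / (people * human_rate)
years_needed = seconds_needed / (365.25 * 24 * 3600)
print(f"about {years_needed:.0f} years")       # on the order of the 460 years quoted
```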

Related: National Center for Computational Sciences at ORNL site on Jaguar (photo from here) – Open Science Computer Grid – Donald Knuth, Computer Scientist – Saving Fermilab – New Approach Builds Better Proteins Inside a Computer – Does the Data Deluge Make the Scientific Method Obsolete?

Bacteria and Efficient Food Digestion

Gut Bacteria May Cause And Fight Disease, Obesity

“We’re all sterile until we’re born,” says Glenn Gibson, a microbiologist at the University of Reading in Britain. “We haven’t got anything in us right up until the time we come into this big, bad, dirty world.”

But as soon as we pass out of the birth canal, when we are fetched by a doctor’s hands, placed in a hospital crib, put on our mother’s breast, when we drag a thumb across a blanket and stick that thumb in our mouths, when we swallow our first soft food, we are invaded by all sorts of bacteria. Once inside, they multiply – until the bacteria inside us outnumber our human cells.

University of Chicago immunologist Alexander Chervonsky, with collaborators from Yale University, recently reported that doses of the right stomach bacteria can stop the development of type 1 diabetes in lab mice. “By changing who is living in our guts, we can prevent type 1 diabetes,” he told The Wall Street Journal.

The bottom line: We now have two sets of genes to think about – the ones we got from our parents and the ones of organisms living inside us. Our parents’ genes we can’t change, but the other set? Now that is one of the newest and most exciting fields in cell biology.

Follow the link for the related podcast: Gut bacteria may cause and fight disease, obesity. This whole area of the ecosystem within us and our health I find fascinating. And I fall for confirmation bias on things like becoming inefficient at converting food to energy as a way to reduce obesity.

You could have two people sitting down to a bowl of Cheerios; they could each eat the same number of Cheerios, but because of a difference in their gut bacteria one will get more calories than the other.

They then gave an example of the difference being 95 calories versus 99 calories. Hardly seems huge, but it would add up. Still, that is a less amazing difference than I was expecting.
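To see how a 4-calorie gap "adds up", here is the rough arithmetic, assuming one such bowl a day and the common (and very approximate) 3,500-calories-per-pound rule of thumb:

```python
# Rough arithmetic for the 95-vs-99 calorie example above (illustrative assumptions only).
daily_gap_kcal = 99 - 95                    # extra calories extracted per bowl, assuming one bowl a day
yearly_gap_kcal = daily_gap_kcal * 365
pounds_per_year = yearly_gap_kcal / 3500    # ~3,500 kcal per pound of body fat (rule of thumb)
print(f"{yearly_gap_kcal} kcal per year, roughly {pounds_per_year:.1f} lb")  # 1460 kcal, ~0.4 lb
```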

Related: Energy Efficiency of Digestion – Waste from Gut Bacteria Helps Host Control Weight – Obesity Epidemic Partially Explained – Foreign Cells Outnumber Human Cells in Our Bodies

Symptom of America’s Decline in Particle Physics

Land Of Big Science

Probing more deeply than ever before into the stuff of the universe requires some big hardware. It also requires the political will to lavish money on a project that has no predictable practical return, other than prestige and leadership in the branch of science that delivered just about every major technology of the past hundred years.

Those advances came, in large measure, from the United States. The coming decades may be different.

A third of the scientists working at the LHC hail from outside the 20 states that control CERN. America has contributed 1,000 or so researchers, the largest single contingent from any non-CERN nation.

The U.S. contribution amounts to $500 million—barely 5 percent of the bill. The big bucks have come from the Europeans. Germany is picking up 20 percent of the tab, the British are contributing 17 percent, and the French are giving 14 percent.

The most worrying prospect is that scientists from other countries, who used to flock to the United States to be where the action is, are now heading to Europe instead.

This is a point I have made before. The economic benefits of investing in science are real. The economic benefits of having science and engineering centers of excellence in your country are real. That doesn’t mean you automatically gain economic benefit, but it is a huge advantage and opportunity if you act intelligently to make it pay off.

Related: Invest in Science for a Strong Economy – Diplomacy and Science Research – Asia: Rising Stars of Science and Engineering – Brain Drain Benefits to the USA Less Than They Could Be – posts on funding science exploration – posts on basic research – At the Heart of All Matter