Tag Archives: curiouscat

Science Postercasts

I wrote about SciVee over a year ago, saying I thought it could become a valuable resource. It has taken longer to really get going than I expected, but this new feature, Postercasts, is great. I am glad to see SciVee living up to my high expectations. Keep up the great work, SciVee. The experience can still use improvement, but this is a great start.

They have provided a tutorial on: How to Synchronize my Poster to my Video. I hope some of our readers try this out.

via: Interactive Virtual Posters

Related: Engineering TV - Science Webcasts - Magnetic Movie

Wind Power Provided Over 1% of Global Electricity in 2007

graph of global installed wind power capacity

Data from the World Wind Energy Association on installed megawatts (MW) of global wind power capacity in 2007. 19,696 MW of capacity were added in 2007, bringing the total to 93,849 MW. Europe accounts for 61% of installed capacity, Germany for 24% and the USA for 18%.
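The growth those figures imply is easy to check with a few lines of arithmetic. This is just a back-of-the-envelope sketch using only the numbers quoted above:

```python
# Back-of-the-envelope check of the wind capacity figures above
# (data: World Wind Energy Association, end of 2007).
total_2007 = 93_849   # MW installed worldwide at end of 2007
added_2007 = 19_696   # MW added during 2007

total_2006 = total_2007 - added_2007          # capacity entering 2007
growth_pct = 100 * added_2007 / total_2006    # year-over-year growth

print(f"End-2006 capacity: {total_2006:,} MW")   # 74,153 MW
print(f"2007 growth rate: {growth_pct:.0f}%")    # about 27%
```

A 27% annual growth rate means capacity roughly doubles every three years if the pace holds.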

The graph shows the top 10 producers (with the exceptions of Denmark and Portugal) and includes Japan (which is 13th).

Related: USA Wind Power Installed Capacity 1981 to 2005 - Wind Power has the Potential to Produce 20% of Electricity by 2030 - Top 12 Manufacturing Countries in 2007 - Sails for Modern Cargo Ships - MIT’s Energy ‘Manhattan Project’

More Mysterious Space Phenomenon

One of the things I really hope this blog helps accomplish is to show how science progresses, which explains why I use that tag so often (it is my 3rd most used; animals is the most used, engineers 2nd, and fun and webcasts are tied for 4th).

Science is a process of continual learning as curiosity leads us to seek better understanding. On a small scale this can mean a person learning more about knowledge already understood by others. But it also means the scientific community facing new questions and coming up with new explanations for the new questions raised by observations (and testing those new explanations…). Mysterious New ‘Dark Flow’ Discovered in Space

Patches of matter in the universe seem to be moving at very high speeds and in a uniform direction that can’t be explained by any of the known gravitational forces in the observable universe. Astronomers are calling the phenomenon “dark flow.” The stuff that’s pulling this matter must be outside the observable universe, researchers conclude.

They discovered that the clusters were moving nearly 2 million mph (3.2 million kph) toward a region in the sky between the constellations of Centaurus and Vela. This motion is different from the outward expansion of the universe (which is accelerated by the force called dark energy).

“We found a very significant velocity, and furthermore, this velocity does not decrease with distance, as far as we can measure,” Kashlinsky told SPACE.com. “The matter in the observable universe just cannot produce the flow we measure.”

Related: Laws of Physics May Need a Revision - Great Physics Webcast Lectures - Challenging the Science Status Quo - Parasite Rex

Goldbergian Flash Fits Rube Goldberg Web Site

Intentionally, I hope, the Rube Goldberg Machine Contest web site illustrates how to use needlessly complex engineering to design a tool that fails to follow sensible engineering guidelines. Rather than aiming for a well designed, usable product, the desire is to produce a machine that sort-of complies with the requirements but in an extremely foolish, convoluted way. Obviously it would be much more sensible to build the web site with HTML; it would just work simply, easily and quickly for everyone. But Flash is the perfect tool to use if you want to promote Goldbergian thinking.

The web site, for example, does display content in a web browser, provided that browser has the proper Flash plugin installed. And sure, the conventions of the web don’t work in this crippled environment, but who cares about that when designing Goldbergian web sites? Of course, if you actually wanted to design a good web site such choices would be, let’s see, oh yeah, lame. I would link to the contest information, but in good Flash Goldbergian fashion that is not possible with the non-website website they have.

Related: Rube Goldberg Machine Contest - Rube Goldberg Devices from Japan - NASA You Have a Problem - 340 Years of Royal Society Journals Online - NSF Engineering Division Reorganization - How to Design for the Web

Science Policy Research Virtual Intern

externs.com is another curiouscat.com web site that lists internship opportunities. I am surprised that virtual internships and externships have not grown much more popular in the last 5 years. Scientists and Engineers for America do have such a virtual internship:

Members of the first Scientists and Engineers for America (SEA) virtual intern class can be located anywhere in the world and will work remotely on specific SEA projects. Interns will research the positions elected officials and candidates for office take on science policy issues.

The internship is for 10 to 20 hours per week and can be done anywhere, as long as you have a computer, internet connection, and telephone. The dates of the internship are flexible and applications are accepted on a rolling basis.

Also see the externs.com science internships and engineering internships. If you have an internship you would like included, please add it (there is no cost for listing or using the site).

Related: Summer Jobs for Smart Young Minds - Preparing Computer Science Students for Jobs - Science and Engineering Scholarships and Fellowships - Scientists and Engineers in Congress

500 Year Floods

Why you can get ‘500 year floods’ two years in a row by Anne Jefferson:

Flood probabilities are based on historical records of stream discharge. Let’s use the Iowa River at Marengo, Iowa as an example. It reached a record discharge of 46,600 cubic feet per second* (1320 m3/s) on 12 June. That flow was estimated to have a 500 year recurrence interval, based on 51 years of peak flow records.

When you are extrapolating beyond your data by an order of magnitude, the highest points in the dataset start to have a lot of leverage. Let’s imagine that there’s another big flood on the Iowa River next year and we do the same analysis. Now our dataset has 52 points, with the highest being the flood of 2008. When that point is included in the analysis, a discharge of 46,600 cubic feet per second* (1320 m3/s) has a recurrence interval of <150 years (>0.6%). It’s still a darn big flow, but it doesn’t sound quite so biblical anymore.

Urbanization and the addition of impervious surfaces is one cause of increasing flood peaks, but in Iowa, a more likely culprit is agriculture.
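The leverage effect Jefferson describes can be sketched numerically. The example below uses the simple Weibull plotting position T = (n + 1) / m, not the log-Pearson Type III fit actually used in flood frequency analysis, and a made-up 51-year record, just to show how one new record flood shifts the empirical recurrence estimate:

```python
# Simplified illustration of recurrence intervals using the Weibull
# plotting position T = (n + 1) / m, where n is the number of annual
# peaks on record and m is the rank of a given peak (1 = largest).
# This is NOT the fitted-distribution extrapolation agencies use to
# estimate 500-year events; it only shows how the empirical estimate
# shifts when a new record flood enters the dataset.

def recurrence_interval(peaks, flow):
    """Empirical recurrence interval (years) of `flow` among annual peaks."""
    n = len(peaks)
    rank = 1 + sum(1 for p in peaks if p > flow)  # rank 1 = largest on record
    return (n + 1) / rank

# Hypothetical 51-year record where 46,600 cfs is the largest peak:
peaks = [20_000 + 400 * i for i in range(50)] + [46_600]
print(recurrence_interval(peaks, 46_600))   # 52.0, i.e. (51 + 1) / 1

# Add an even bigger flood the next year; 46,600 cfs is now rank 2:
peaks.append(60_000)
print(recurrence_interval(peaks, 46_600))   # 26.5, i.e. (52 + 1) / 2
```

One new data point cuts the empirical estimate for the same discharge in half, which is the plain-arithmetic version of the "leverage" of the highest points in a short record.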

This post is a good explanation that the "500 year flood" idea is just a way of saying a 0.2% annual probability (which some people confuse as meaning it can only happen once every 500 years). But I actually am more interested in the other factor: how much estimation is in a "500 year prediction." We don’t have 500 years of data. And conditions today (I believe) are much more likely to create extreme conditions. So taking comfort in 500 year (0.2%), or even 100 year (1% probability), flood "predictions" is dangerous.
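The probability framing is easy to make concrete. If each year is treated as independent, the chance of at least one exceedance over a span of years is 1 - (1 - p)^n; a minimal sketch:

```python
# Chance of seeing at least one "rare" flood over a span of years,
# treating annual exceedances as independent events.

def prob_at_least_one(annual_p, years):
    """Probability of at least one exceedance in `years` independent years."""
    return 1 - (1 - annual_p) ** years

print(f"{prob_at_least_one(0.002, 30):.1%}")   # '500 yr' flood over a 30-year mortgage: 5.8%
print(f"{prob_at_least_one(0.002, 100):.1%}")  # '500 yr' flood over a century: 18.1%
print(f"{prob_at_least_one(0.01, 30):.1%}")    # '100 yr' flood over 30 years: 26.0%
```

So even taking the estimated probabilities at face value, a "100 year" floodplain has roughly a 1-in-4 chance of flooding during a 30-year mortgage.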

It would seem to me, in fact, that actually having a 500 year flood increases the estimated odds of it happening again (because the data now include that case, which had not been included before). It doesn’t increase the true likelihood of it happening, but the predictions we make are based on the data we have (so given that it happened, our previous 500 year prediction is questionable). With a coin toss we know the odds are 50%, so getting 3 heads in a row doesn’t convince us that our prediction was bad, and therefore the previous record of heads or tails has no predictive value.

I can’t see why we would think that way for floods. With the new data showing a flood, it seems to me most any model is likely to show an increased (and I would think pretty substantial) risk of it happening again in the next 100 years, especially in any area with substantial human construction, where conditions could well be very different than they were when our data were collected 20, 40… years ago. And if we are entering a period of more extreme weather, then that will likely be a factor too…

The comments on the original blog post make some interesting points too – don’t miss those.

Related: Two 500-Year Floods Within 15 Years: What are the Odds? (USGS) - All Models Are Wrong But Some Are Useful by George Box - Cancer Deaths – Declining Trend? - Megaflood Created the English Channel - Seeing Patterns Where None Exists - Dangers of Forgetting the Proxy Nature of Data - Understanding Data

Protecting the Food Supply

A few weeks ago we posted about Tracking Down Tomato Troubles as another example of the challenges of scientific inquiry. Too often, in the rare instances that science is even discussed in the news, the presentation provides the illusion of simple obvious answers. Instead it is often a very confusing path until the answers are finally found (posts on scientific investigations in action). At which time it often seems obvious what was going on. But to get to the solutions we need dedicated and talented scientists to search for answers.

Now the CDC is saying tomatoes might not be the source of the salmonella after all: CDC investigates possible non-tomato salmonella sources.

Federal investigators retraced their steps Monday as suspicions mount that fresh unprocessed tomatoes aren’t necessarily causing the salmonella outbreak that has sickened hundreds across the USA.

Three weeks after the Food and Drug Administration warned consumers to avoid certain types of tomatoes linked to the salmonella outbreak, people are still falling ill, says Robert Tauxe with the Centers for Disease Control and Prevention. The latest numbers as of Monday afternoon were 851 cases, some of whom fell ill as recently as June 20, says Tauxe, deputy director of the CDC’s division of foodborne diseases.

The CDC launched a new round of interviews over the weekend. “We’re broadening the investigation to be sure it encompasses food items that are commonly consumed with tomatoes,” Tauxe says. If another food is found to be the culprit after tomatoes were recalled nationwide and the produce industry sustained losses of hundreds of millions of dollars, food safety experts say the public’s trust in the government’s ability to track foodborne illnesses will be shattered.

“It’s going to fundamentally rewrite how we do outbreak investigations in this country,” says Michael Osterholm of the Center for Infectious Disease Research and Policy at the University of Minnesota. “We can’t let this investigation, however it might turn out, end with just the answer of ‘What caused it?’ We need to take a very in-depth look at foodborne disease investigation as we do it today.”

I am inclined to believe the FDA is not focused enough on food safety. Perhaps we are not funding it enough, but we sure are spending tons of money overall, so I can’t believe more total spending is the answer. Maybe just fewer bills passed (that the politicians don’t even bother to read) with favors to special interests, and instead funding to support science and food safety. Or perhaps we are funding enough (though I am skeptical of this contention) and we just are not allowing food safety to get in the way of what special interests want (so we fund the FDA plenty to have managed this much better, to have systems in place that would provide better evidence, but they are either prevented from doing so or failed to do so). I am inclined to believe special interests have more sway in agencies (NASA, EPA, FDA…) than the public good and scientific openness, which is very sad. And, it seems to me, politicians have overwhelmingly chosen not to support more science in places like the FDA, CDC and NIH while increasing federal spending in other areas dramatically.

Related: USDA’s failure to protect the food supply - FDA May Make Decision That Will Speed Antibiotic Drug Resistance - Food safety proposal: throw the bums out - The A to Z Guide to Political Interference in Science

Women Choosing Other Fields Over Engineering and Math

graph of science and engineering degrees by gender in the USA 1966-2005

The graph shows college degrees granted in the USA. This topic sets one up for criticism, but I believe it is more important to examine the data and explore the possible explanations than to avoid anything that might be questioned by the politically correct police. An important factor, to me anyway, is that women are now graduating from college in far higher numbers than men. And in many science fields female baccalaureate graduates outnumber male graduates (psychology [67,000 to 19,000], biology [42,000 to 26,000], anthropology, sociology [20,000 to 8,000]) while men outnumber women in others (math [7,000 to 6,000], engineering [53,000 to 13,000], computer science [39,000 to 11,000], physics [3,000 to 900]).

Data on degrees awarded men and women in the USA in 2005, from NSF*:

Field              Bachelor's          Master's            Doctorate
                   Women     Men       Women     Men       Women     Men
Biology            42,283    25,699    4,870     3,229     3,105     3,257
Computer Science   11,235    39,329    5,078     12,742    225       909
Economics          8,141     17,023    1,391     2,113     355       827
Engineering        13,197    52,936    7,607     26,492    1,174     5,215
Geosciences        1,660     2,299     712       973       243       470
Physics            903       3,307     427       1,419     200       1,132
Psychology         66,833    19,103    12,632    3,444     2,264     211
Sociology          20,138    8,438     920       485       343       211
All S&E            235,197   230,806   53,051    66,974    10,533    17,405
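One way to read the table is as women's share of degrees in each field. A quick sketch using a few of the bachelor's numbers above:

```python
# Women's share of 2005 US bachelor's degrees, computed from the
# NSF numbers in the table above (a few representative fields).
bachelors = {           # field: (women, men)
    "Biology": (42_283, 25_699),
    "Computer Science": (11_235, 39_329),
    "Engineering": (13_197, 52_936),
    "Physics": (903, 3_307),
    "Psychology": (66_833, 19_103),
}
for field, (women, men) in bachelors.items():
    share = 100 * women / (women + men)
    print(f"{field}: {share:.0f}% women")
```

The spread is striking: from roughly a fifth of engineering and physics degrees to well over three quarters of psychology degrees.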

What does this all mean? It is debatable, but I think it is very good news for the efforts many have made over the last few decades to open up opportunities for women. I still support efforts to provide opportunities for girls to get started in science and engineering but I think we have reached the day when the biggest concern is giving all kids better math and science primary education (and related extracurricular activities). Also continued focus and effort on the doctorate and professional opportunities for women is warranted.

Medical Study Integrity (or Lack Thereof)

Merck wrote drug studies for doctors

The drug maker Merck drafted dozens of research studies for a best-selling drug, then lined up prestigious doctors to put their names on the reports before publication, according to an article to be published Wednesday in a leading medical journal.

The article, based on documents unearthed in lawsuits over the pain drug Vioxx, provides a rare, detailed look into the industry practice of ghostwriting medical research studies that are then published in academic journals.

“It almost calls into question all legitimate research that’s been conducted by the pharmaceutical industry with the academic physician,” said Ross, whose article, written with colleagues, was published Wednesday in JAMA, The Journal of the American Medical Association, and posted Tuesday on the journal’s Web site.

Merck acknowledged Tuesday that it sometimes hired outside medical writers to draft research reports before handing them over to the doctors whose names eventually appear on the publication. But the company disputed the article’s conclusion that the authors do little of the actual research or analysis.

It is sad that the integrity of journals and scientists is so weak that it leaves them open to such charges. The significant presence of the corrupting influence of too much money leaves doubt in my mind that the best science is the goal, which is very sad. In Funding Medical Research, I discussed my concern that universities are acting more like profit motivated organizations than science motivated organizations. I am in favor of profit motivated organizations (those getting the micro-financing in this link, for example), but such organizations should not be trusted to provide honest and balanced opinions; they should be expected to provide biased opinions.

If universities (and scientists branding themselves as … at X university) want to be seen as honest brokers of science, they can’t behave as though raising money, getting patents… are their main objectives. Many want to be able to get the money and still retain the sense of an organization focused on the pursuit of science above all else. Sorry, you can’t have it both ways. You can, and probably should, try to stake out some ground in the middle. And for me right now, partially because they fail to acknowledge the extent to which money seems to drive decisions, I don’t believe they are trying to be open and honest; instead I get the impression they are leaning more toward trying to market and sell.

Funding Medical Research

Cheap, ‘safe’ drug kills most cancers

It sounds almost too good to be true: a cheap and simple drug that kills almost all cancers by switching off their “immortality”. The drug, dichloroacetate (DCA), has already been used for years to treat rare metabolic disorders and so is known to be relatively safe. It also has no patent, meaning it could be manufactured for a fraction of the cost of newly developed drugs.

Evangelos Michelakis of the University of Alberta in Edmonton, Canada, and his colleagues tested DCA on human cells cultured outside the body and found that it killed lung, breast and brain cancer cells, but not healthy cells. Tumours in rats deliberately infected with human cancer also shrank drastically when they were fed DCA-laced water for several weeks.

DCA attacks a unique feature of cancer cells: the fact that they make their energy throughout the main body of the cell, rather than in distinct organelles called mitochondria. This process, called glycolysis, is inefficient and uses up vast amounts of sugar.

Until now it had been assumed that cancer cells used glycolysis because their mitochondria were irreparably damaged. However, Michelakis’s experiments prove this is not the case, because DCA reawakened the mitochondria in cancer cells. The cells then withered and died.

The University of Alberta is raising funds to further the research. Some look at this and indict a funding system that does not support research for human health unless there is profit to be made. Much of the blame seems to go to profit focused drug companies. I can see room for some criticism. But really I think the criticism is misplaced.

The organizations for which curing cancer is a partial aim (rather than making money) should fund such efforts, if they have merit: government (public health…), public universities (science and medical research…), foundations, cancer societies, private universities… Universities have huge research budgets. Unfortunately many see profit as their objective and research as the means to that objective (based on their actions, not their claims). These entities with supposedly noble purposes are the ones I blame most, not profit focused companies (though yes, if they claim an aim of health care then I would blame them too).

Now I don’t know which category this particular research falls into: extremely promising, or a decent risk that might work, just like hundreds or thousands of other possibilities. But let’s look at several possibilities. Some others’ thoughts on where it falls: Dichloroacetate to enter clinical trials in cancer patients; from a previous post here, Not a Cancer Cure Yet; The dichloroacetate (DCA) cancer kerfuffle; CBC’s ‘The Current’ on dichloroacetate (DCA); Dichloroacetate (DCA) Phase II Trial To Begin (“Like hundreds (if not, thousands) of compounds being tested to treat cancer, DCA was shown by Michelakis’ group earlier this year to slow the growth of human lung tumors in a preclinical rodent model.”).