Friday, November 6, 2009

Survival of the Weakest: Why Neanderthals Went Extinct

Published Oct 29, 2009
From the magazine issue dated Nov 9, 2009
Sharon Begley

Thanks to recent discoveries that they were canny hunters, clever toolmakers, and probably endowed with the gift of language, Neanderthals have overcome some of the nastier calumnies hurled at them, especially that they were the "dumb brutes of the North," as evolutionary ecologist Clive Finlayson describes their popular image. But they have never managed to shake the charge that their extinction 30,000 years ago, when our subspecies of Homo sapiens replaced them in their European home, was their own dumb fault. Modern humans mounted a genocidal assault on them, goes one explanation, triumphing through superior skills. Moderns drove them into extinction through greater evolutionary fitness, says another, especially the moderns' greater intelligence or social advances like the sexual division of labor.


Winners—of prehistory no less than history—get to write the textbooks. So it is no surprise that we, the children of the humans who replaced Neanderthals, "portray ourselves in the role of victors and reduce the rest [of the human lineage] to the lower echelons of vanquished," Finlayson writes. "To accept our existence as the product of chance requires a large dose of humility." But in a provocative new book, The Humans Who Went Extinct: Why Neanderthals Died Out and We Survived, he argues that chance is precisely what got us here. "A slight change of fortunes and the descendants of the Neanderthals would today be debating the demise of those other people that lived long ago," he argues.
Evolutionary biologists have long recognized the role serendipity plays in which species thrive and which wither on the Darwinian vine. Without the asteroid impact 65 million years ago, for instance, mammals would not have spread so soon into almost every ecological niche on Earth (dinosaurs were in the way). Yet when the subject strikes as close to home as why our ancestors survived and Neanderthals did not, scientists have resisted giving chance a starring role, preferring to credit the superiority of ancient H. sapiens. Both lineages descend from Homo erectus: one spread across Eurasia beginning 1.8 million years ago and evolved into Neanderthals by 300,000 years ago, while the other evolved in Africa, becoming anatomically modern by 200,000 years ago and reaching Europe some 45,000 years ago.


These arrivistes are often portrayed as technologically and culturally more advanced, with their bone and ivory (not just stone) tools and weapons, their jewelry making and cave painting—the last two evidence of symbolic thought. Finlayson has his doubts. Neanderthals may have painted, too (but on perishable surfaces); they were no slouches as toolmakers; and studies of their DNA show they had the same genes for speech that we do. "They survived for nearly 300,000 years," Finlayson says by phone from Gibraltar, where he is director of the Gibraltar Museum. "That modern humans got to Australia before they penetrated Europe suggests that Neanderthals held them off for millennia. That suggests they weren't that backward."

Instead, moderns were very, very lucky—so lucky that Finlayson calls what happened "survival of the weakest." About 30,000 years ago, the vast forests of Eurasia began to retreat, leaving treeless steppes and tundra and forcing forest animals to disperse over vast distances. Because they evolved in the warm climate of Africa before spreading into Europe, modern humans had a body like marathon runners, adapted to track prey over such distances. But Neanderthals were built like wrestlers. That was great for ambush hunting, which they practiced in the once ubiquitous forests, but a handicap on the steppes, where endurance mattered more. This is the luck part: the open, African type of terrain in which modern humans evolved their less-muscled, more-slender body type "subsequently expanded so greatly" in Europe, writes Finlayson. And that was "pure chance."

Because Neanderthals were not adept at tracking herds on the tundra, they had to retreat with the receding woodlands. They made their last stand where pockets of woodland survived, including in a cave in the Rock of Gibraltar. There, Finlayson and colleagues discovered in 2005, Neanderthals held on at least 2,000 years later than anywhere else before going extinct, victims of bad luck more than any evolutionary failings, let alone any inherent superiority of their successors.

Tuesday, November 3, 2009

What Does a Smart Brain Look Like?

November 2009
Scientific American Mind
By: Richard J. Haier

A new neuroscience of intelligence is revealing that not all brains work in the same way

  • Brain structure and metabolic efficiency may underlie individual differences in intelligence, and imaging research is pinpointing which regions are key players.
  • Smart brains work in many different ways. Women and men who have the same IQ show different underlying brain architectures.
  • The latest research suggests that an individual’s pattern of gray and white matter might underlie his or her specific cognitive strengths and weaknesses.

We all know someone who is not as smart as we are—and someone who is smarter. At the same time, we all know people who are better or worse than we are in a particular area or task, say, remembering facts or performing rapid mental math calculations. These variations in abilities and talents presumably arise from differences among our brains, and many studies have linked certain very specific tasks with cerebral activity in localized areas. Answers about how the brain as a whole integrates activity among areas, however, have proved elusive. Just what does a “smart” brain look like?

Now, for the first time, intelligence researchers are beginning to put together a bigger picture. Imaging studies are uncovering clues to how neural structure and function give rise to individual differences in intelligence. The results so far are confirming a view many experts have had for decades: not all brains work in the same way. People with the same IQ may solve a problem with equal speed and accuracy but use different combinations of brain areas. [For more on IQ and intelligence, see “Rational and Irrational Thought: The Thinking That IQ Tests Miss,” by Keith E. Stanovich.]

Men and women show group average differences on neuroimaging measures, as do older and younger groups, even at the same level of intelligence. But newer studies are demonstrating that individual differences in brain structure and function, as they relate to intelligence, are key—and the latest studies have exposed only the tip of the iceberg. These studies hint at a new definition of intelligence, based on the size of certain brain areas and the efficiency of information flow among them. Even more tantalizing, brain scans soon may be able to reveal an individual’s aptitude for certain academic subjects or jobs, enabling accurate and useful education and career counseling. As we learn more about intelligence, we will better understand how to help individuals fulfill or perhaps enhance their intellectual potential and success.

For 100 years intelligence research relied on pencil-and-paper testing for metrics such as IQ. Psychologists used statistical methods to characterize the different components of intelligence and how they change over people’s lifetimes. They determined that virtually all tests of mental ability, irrespective of content, are positively related to one another—that is, those who score high on one test tend to score high on the others. This fact implies that all tests share a common factor, which was dubbed g, a general factor of intelligence. The g factor is a powerful predictor of success and is the focus of many studies. [For more on g, see “Solving the IQ Puzzle,” by James R. Flynn; Scientific American Mind, October/November 2007.]

In addition to the g factor, psychologists also have established other primary components of intelligence, including spatial, numerical and verbal factors, reasoning abilities known as fluid intelligence, and knowledge of factual information, called crystallized intelligence. But the brain mechanisms and structures underlying g and the other factors could not be inferred from test scores, or even from studies of individuals with brain damage, and thus remained hidden.

The advent of neuroscience techniques about 20 years ago finally offered a way forward. New methods, particularly neuroimaging, now allow a different approach to defining intelligence based on physical properties of the brain. In 1988 my colleagues and I at the University of California, Irvine, conducted one of the first studies to use such techniques. Using positron-emission tomography (PET), which produces images of metabolism in the brain by detecting the amount of low-level radioactive glucose used by neurons as they fire, we traced the brain’s energy use while a small sample of volunteers solved nonverbal abstract reasoning problems on a test called the Raven’s Advanced Progressive Matrices.

This test is known to be a good indicator of g, so we were hoping to answer the question of where general intelligence arises in the brain by determining which areas showed increased activation while solving the test problems. To our surprise, greater energy use (that is, increased glucose metabolism) was associated with poorer test performance. Smarter people were using less energy to solve the problems—their brains were more efficient.

The next obvious question was whether energy efficiency can arise through practice. In 1992 we used PET before and after subjects learned the computer game Tetris (a fast-paced visuospatial puzzle), and we found less energy use in several brain areas after 50 days of practice and increased skill. The data suggest that over time the brain learns what areas are not necessary for better performance, and activity in those areas diminishes—leading to greater overall efficiency. Moreover, the individuals in the study with high g showed more brain efficiency after practice than the people with lower g.

Thursday, October 29, 2009

Naked Mole Rat Wins the War on Cancer

By Jocelyn Kaiser
ScienceNOW Daily News
26 October 2009

With its wrinkled skin and buck teeth, the naked mole rat isn't going to win any beauty contests. But the burrowing desert rodent is exceptional in another way: It doesn't get cancer. The naked mole rat's cells hate to be crowded, it turns out, so they stop growing before they can form tumors. The details could someday lead to a new strategy for treating cancer in people.

In search of clues to aging, cell biologists Vera Gorbunova, Andrei Seluanov, and colleagues at the University of Rochester have been comparing rodents that vary in size and life span, from mice to beavers. The naked mole rat stands out because it's small yet can live more than 28 years--seven times as long as a house mouse. Resistance to cancer could be a major factor; whereas most laboratory mice and rats die from the disease, it has never been observed in naked mole rats.

Gorbunova's team looked at the mole rat's cells for an answer. Normal human and mouse cells will grow and divide in a petri dish until they mash tightly against one another in a single, dense layer--a mechanism known as "contact inhibition." Naked mole rat cells are even more sensitive to their neighbors, the researchers found. The cells stop growing as soon as they touch. The strategy likely helps keep the rodents cancer-free, as contact inhibition fails in cancerous cells, causing them to pile up.


The reason, the researchers discovered, is that naked mole rat cells rely on two proteins--named p27Kip1 and p16Ink4a--to stop cell growth when they touch, whereas human and mouse cells rely mainly on p27Kip1. "They use an additional checkpoint," says Gorbunova, whose study appears online today in the Proceedings of the National Academy of Sciences (PNAS). When the team mutated the naked mole rat cells so that they grew much closer together than they had before, levels of p16Ink4a dropped.

The naked mole rat's kind of cancer prevention may prove relevant to humans because the same genes are involved, says Brown University cancer biologist John Sedivy. The rat's defenses "evolved separately but use the same nuts and bolts," he says. Sedivy writes in an accompanying commentary in PNAS that it may be possible to "tweak the entire network [of tumor-suppressing pathways] to develop new prevention strategies."

The next step, Gorbunova says, is to find other proteins and molecules that make up this new contact inhibition pathway. One obstacle is that little is known about the naked mole rat's genes. The critter has been proposed for genome sequencing but so far has been turned down. "I hope Vera's study will put the naked mole rat higher up in the queue," says George Martin, an aging researcher and professor emeritus at the University of Washington.

Tuesday, October 27, 2009

Secrets of frog killer laid bare

By Richard Black, Environment correspondent,
BBC News website

Scientists have unravelled the mechanism by which the fungal disease chytridiomycosis kills its victims.

The fungus is steadily spreading through populations of frogs and other amphibians worldwide, and has sent some species extinct in just a few years.

Researchers now report in the journal Science that the fungus kills by changing the animals' electrolyte balance, resulting in cardiac arrest.

The finding is described as a "key step" in understanding the epidemic.

Karen Lips, one of the world authorities on the spread of chytridiomycosis, said the research was "compelling".

"They've done an incredible amount of work, been very thorough, and I don't think anybody will have problems with this.

"We suspected something like this all along, but it's great to know this is in fact what is happening," the University of Maryland professor told BBC News.

Skin deep

Amphibian skin plays several roles in the animals' life.

Most species can breathe through it, and it is also used as a membrane through which electrolytes such as sodium and potassium are exchanged with the outside world.

The mainly Australian research group took skin samples from healthy and diseased green tree frogs, and found that these electrolytes passed through the skin much less readily when chytrid was present.

Samples of blood and urine from infected frogs showed much lower sodium and potassium concentrations than in healthy animals - potassium was down by half.

In other animals including humans, this kind of disturbance is known to be capable of causing cardiac arrest.

The scientists also took electrocardiogram recordings of the frogs' hearts in the hours before death; and found changes to the rhythm culminating in arrest.

Drugs that restore electrolyte balance brought the animals a few hours or days of better health, some showing enough vigour to climb out of their bowls of water; but all died in the end.

Grail quest

Lead scientist Jamie Voyles, from James Cook University in Townsville, said the next step was to look for the same phenomenon in other species.

"This is lethal across a broad range of hosts, whether terrestrial or aquatic, so it's really important to look at what's happening in other susceptible amphibians," she said.

Another step will be to examine how the chytrid fungus (Batrachochytrium dendrobatidis - Bd) impairs electrolyte transfer.

"What this work doesn't tell us is the mechanism by which chytrid causes this problem with sodium," said Matthew Fisher from Imperial College London.

"It could be that Bd is excreting a toxin, or it could be causing cell damage. This causative action is actually the 'holy grail' - so that's another obvious next step."

The finding is unlikely to lead immediately to ways of preventing, treating, or curing the disease in the wild.

Curing infected amphibians in captivity is straightforward using antifungal chemicals; but currently there is no way to tackle it outside.

Various research teams are exploring the potential of bacteria that occur naturally on the skin of some amphibians, and may play a protective role.


Understanding the genetics of how Bd disrupts electrolyte balance might lead to more precise identification of protective bacteria, suggested Professor Lips, and so eventually play a role in curbing the epidemic.

Gene therapy transforms eyesight of 12 born with rare defect

A single injection in a patient's eye brings 'astounding' results. The findings may offer hope for those with macular degeneration and retinitis pigmentosa.

By Thomas H. Maugh II
October 25, 2009

Pennsylvania researchers using gene therapy have made significant improvements in vision in 12 patients with a rare inherited visual defect, a finding that suggests it may be possible to produce similar improvements in a much larger number of patients with retinitis pigmentosa and macular degeneration.


The team last year reported success with three adult patients, an achievement that was hailed as a major accomplishment for gene therapy. They have now treated an additional nine patients, including five children, and find that the best results are achieved in the youngest patients, whose defective retinal cells have not had time to die off.

The youngest patient, 9-year-old Corey Haas, was considered legally blind before the treatment began. His play was confined largely to his house and driveway, he had immense difficulty navigating an obstacle course, and he required special enlarging equipment for books and help in the classroom.

Today, after a single injection of a gene-therapy product in one eye, he rides his bike around the neighborhood, needs no assistance in the classroom, navigates the obstacle course quickly and has even played his first game of softball.
The results are "astounding," said Stephen Rose, chief scientific officer of Foundation Fighting Blindness, which supported the work but was not involved directly. "The big take-home message from this is that every individual in the group had improvement… and there were no safety issues at all."

The study "holds great promise for the future" and "is appealing because of its simplicity," wrote researchers from the Nijmegen Medical Center in the Netherlands in an editorial accompanying the report, which was published online Saturday by the journal Lancet.

The 12 patients had Leber's congenital amaurosis, which affects about 3,000 people in the United States and perhaps 130,000 worldwide. Victims are born with severely impaired vision that deteriorates until they are totally blind, usually in childhood or adolescence. There is no treatment.

Leber's is a good candidate for gene therapy because most of the visual apparatus is intact, particularly at birth and in childhood. Mistakes in 13 different genes are known to cause it, but all 12 of the patients suffered a defect in a gene called RPE65. The gene encodes a protein that produces a vitamin A derivative crucial for detecting light.

About five children are born each year in the United States with that defect, which was chosen for the trial because researchers at the Children's Hospital of Philadelphia and the University of Pennsylvania School of Medicine had cloned the gene, making copies available for use.

The study, led by Dr. Katherine A. High, Dr. Albert M. Maguire and Dr. Jean Bennett of those two institutions, enrolled five people in the United States, five from Italy and two from Belgium. Five were children, and the oldest was 44.
The good copy of the RPE65 gene was inserted into a defanged version of an adeno-associated virus. The engineered virus then invaded retinal cells and delivered the working gene inside them.

Maguire used a long, thin needle to insert the preparation into the retina of the worst eye in each of the patients. Within two weeks, the treated eyes began to become more sensitive to light, and within a few more weeks, vision began to improve. The younger the patients were, the better they responded. That was expected, Bennett said, because similar results had been observed in dogs and rodents.

By both objective and subjective measures, vision improved for all the patients. They were able to navigate obstacle courses, read eye charts and perform most of the tasks of daily living. The improvement has now persisted for as long as two years.

The children who were treated "are now able to walk and play just like any normally sighted child," Maguire said.

Bennett noted that the oldest patient in the trial, a mother, had not been able to walk down the street to meet her children at school. "Now she can. She also achieved her primary goal, which was to see her daughter hit a home run."

There are clear limitations to the study. The patients' vision was not corrected to normal because of the damage that had already been done to the retina, and only one eye was treated.

"The big elephant in the room is: Can you treat the other eye?" Rose said.
The foundation will put more funding into the research "to make sure that if you go back and treat the other eye, it won't ablate the positive results in the first eye due to an immune reaction or something else."

Researchers also have not optimized the dosage of the viral vector used to carry the gene into the eye. Those issues will be studied in Phase 2, a larger clinical trial that they hope to begin soon.

Meanwhile, the team has begun treating some patients at the University of Iowa.
Researchers also hope they will be able to translate the results to other congenital conditions using different genes.

Leber's is one form of retinitis pigmentosa, which affects an estimated 100,000 Americans.

The findings might be applicable to macular degeneration, which affects an estimated 1.25 million Americans and is the major cause of visual impairment in the elderly.

Friday, October 2, 2009

World’s oldest human-linked skeleton found


‘Ardi’ predates Lucy by a million years, changes scientific view of origins

By Randolph E. Schmid
updated 6:23 p.m. CT, Thurs., Oct. 1, 2009

WASHINGTON - The story of humankind is reaching back another million years with the discovery of “Ardi,” a hominid who lived 4.4 million years ago in what is now Ethiopia.


J.H. Matternes
An artist's rendering shows Ardipithecus ramidus as it might have looked in life.


The 110-pound, 4-foot female roamed forests a million years before the famous Lucy, long studied as the earliest skeleton of a human ancestor.
This older skeleton reverses the common wisdom of human evolution, said anthropologist C. Owen Lovejoy of Kent State University.
Rather than humans evolving from an ancient chimplike creature, the new find provides evidence that chimps and humans evolved from some long-ago common ancestor — but each evolved and changed separately along the way.
“This is not that common ancestor, but it’s the closest we have ever been able to come,” said Tim White, director of the Human Evolution Research Center at the University of California, Berkeley.
The lines that evolved into modern humans and living apes probably shared an ancestor 6 million to 7 million years ago, White said in a telephone interview.
But Ardi has many traits that do not appear in modern-day African apes, leading to the conclusion that the apes evolved extensively since we shared that last common ancestor.
A study of Ardi, under way since the first bones were discovered in 1994, indicates the species lived in the woodlands and could climb on all fours along tree branches, but the development of their arms and legs indicates they didn’t spend much time in the trees. And they could walk upright, on two legs, when on the ground.
Formally dubbed Ardipithecus ramidus — which means root of the ground ape — the find is detailed in 11 research papers published Thursday by the journal Science.
“This is one of the most important discoveries for the study of human evolution,” said David Pilbeam, curator of paleoanthropology at Harvard’s Peabody Museum of Archaeology and Ethnology.
“It is relatively complete in that it preserves head, hands, feet and some critical parts in between. It represents a genus plausibly ancestral to Australopithecus — itself ancestral to our genus Homo,” said Pilbeam, who was not part of the research teams.
Scientists assembled the skeleton from 125 pieces.

The area where "Ardi" was found is rich in sites where the fossils of human ancestors have been found.
Lucy, also found in Africa, thrived a million years after Ardi and was of the more humanlike genus Australopithecus.
“In Ardipithecus we have an unspecialized form that hasn’t evolved very far in the direction of Australopithecus. So when you go from head to toe, you’re seeing a mosaic creature that is neither chimpanzee, nor is it human. It is Ardipithecus,” said White.
White noted that Charles Darwin, whose research in the 19th century paved the way for the science of evolution, was cautious about the last common ancestor between humans and apes.
“Darwin said we have to be really careful. The only way we’re really going to know what this last common ancestor looked like is to go and find it. Well, at 4.4 million years ago we found something pretty close to it,” White said. “And, just like Darwin appreciated, evolution of the ape lineages and the human lineage has been going on independently since the time those lines split, since that last common ancestor we shared.”
Some details about Ardi in the collection of papers:


  • Ardi was found in Ethiopia’s Afar Rift, where many fossils of ancient plants and animals have been discovered. Findings near the skeleton indicate that at the time it was a wooded environment. Fossils of 29 species of birds and 20 species of small mammals were found at the site.

  • Geologist Giday WoldeGabriel of Los Alamos National Laboratory was able to use volcanic layers above and below the fossil to date it to 4.4 million years ago.

  • Ardi’s upper canine teeth are more like the stubby ones of modern humans than the long, sharp, pointed ones of male chimpanzees and most other primates. An analysis of the tooth enamel suggests a diverse diet, including fruit and other woodland-based foods such as nuts and leaves.

  • Paleoanthropologist Gen Suwa of the University of Tokyo reported that Ardi’s face had a projecting muzzle, giving her an ape-like appearance. But it didn’t thrust forward quite as much as the lower faces of modern African apes do. Some features of her skull, such as the ridge above the eye socket, are quite different from those of chimpanzees. The details of the bottom of the skull, where nerves and blood vessels enter the brain, indicate that Ardi’s brain was positioned in a way similar to modern humans, possibly suggesting that the hominid brain may have been already poised to expand areas involving aspects of visual and spatial perception.

  • Ardi’s hand and wrist were a mix of primitive traits and a few new ones, but they don’t include the hallmark traits of the modern tree-hanging, knuckle-walking chimps and gorillas. She had relatively short palms and flexible fingers, allowing her to support her body weight on her palms while moving along tree branches, but she had to be a careful climber because she lacked the anatomical features that allow modern-day African apes to swing, hang and move easily through the trees.

  • The pelvis and hip show the gluteal muscles were positioned so she could walk upright.

  • Her feet were rigid enough for walking but still had a grasping big toe for use in climbing.

The research was funded by the National Science Foundation, the Institute of Geophysics and Planetary Physics of the University of California, Los Alamos National Laboratory, the Japan Society for the Promotion of Science and others.

Monday, September 21, 2009

Born to be Big

Early exposure to common chemicals may be programming kids to be fat.
By Sharon Begley, NEWSWEEK
Published Sep 11, 2009
From the magazine issue dated Sep 21, 2009

It’s easy enough to find culprits in the nation's epidemic of obesity, starting with tubs of buttered popcorn at the multiplex and McDonald's 1,220-calorie deluxe breakfasts, and moving on to the couch potatofication of America. Potent as they are, however, these causes cannot explain the ballooning of one particular segment of the population, a segment that doesn't go to movies, can't chew, and was never that much into exercise: babies. In 2006 scientists at the Harvard School of Public Health reported that the prevalence of obesity in infants under 6 months had risen 73 percent since 1980. "This epidemic of obese 6-month-olds," as endocrinologist Robert Lustig of the University of California, San Francisco, calls it, poses a problem for conventional explanations of the fattening of America. "Since they're eating only formula or breast milk, and never exactly got a lot of exercise, the obvious explanations for obesity don't work for babies," he points out. "You have to look beyond the obvious."
The search for the non-obvious has led to a familiar villain: early-life exposure to traces of chemicals in the environment. Evidence has been steadily accumulating that certain hormone-mimicking pollutants, ubiquitous in the food chain, have two previously unsuspected effects. They act on genes in the developing fetus and newborn to turn more precursor cells into fat cells, which stay with you for life. And they may alter metabolic rate, so that the body hoards calories rather than burning them, like a physiological Scrooge. "The evidence now emerging says that being overweight is not just the result of personal choices about what you eat, combined with inactivity," says Retha Newbold of the National Institute of Environmental Health Sciences (NIEHS) in North Carolina, part of the National Institutes of Health (NIH). "Exposure to environmental chemicals during development may be contributing to the obesity epidemic." They are not the cause of extra pounds in every person who is overweight—for older adults, who were less likely to be exposed to so many of the compounds before birth, the standard explanations of genetics and lifestyle probably suffice—but environmental chemicals may well account for a good part of the current epidemic, especially in those under 50. And at the individual level, exposure to the compounds during a critical period of development may explain one of the most frustrating aspects of weight gain: you eat no more than your slim friends, and exercise no less, yet are still unable to shed pounds.
The new thinking about obesity comes at a pivotal time politically. As the debate over health care shines a light on the country's unsustainable spending on doctors, hospitals, and drugs, the obese make tempting scapegoats. About 60 percent of Americans are overweight or obese, and their health-care costs are higher: $3,400 in annual spending for a normal-weight adult versus $4,870 for an obese adult, mostly due to their higher levels of type 2 diabetes, heart disease, and other conditions. If those outsize costs inspire greater efforts to prevent and treat obesity, fine. But if they lead to demonizing the obese—caricaturing them as indolent pigs raising insurance premiums for the rest of us—that's a problem, and not only for ethical reasons: it threatens to obscure that one potent cause of weight gain may be largely beyond an individual's control.
That idea did not have a very auspicious genesis. In 2002 an unknown academic published a paper in an obscure journal. Paula Baillie-Hamilton, a doctor at Stirling University in Scotland whose only previous scientific paper, in 1997, was titled "Elimination of Firearms Would Do Little to Reduce Premature Deaths," reported a curious correlation. Obesity rates, she noted in The Journal of Alternative and Complementary Medicine, had risen in lockstep with the use of chemicals such as pesticides and plasticizers over the previous 40 years. True enough. But to suggest that the chemicals caused obesity made as much sense as blaming the rise in obesity on, say, hip-hop. After all, both of those took off in the 1970s and 1980s.
Despite that obvious hole in logic, the suggestion of a link between synthetic chemicals and obesity caught the eye of a few scientists. For one thing, there was no question that exposure in the womb to hormonelike chemicals can cause serious illness decades later. Women whose mothers took the antimiscarriage, estrogenlike drug DES during pregnancy, for instance, have a high risk of cervical and vaginal cancer. In that context, the idea that exposure to certain chemicals during fetal or infant development might "program" someone for obesity didn't seem so crazy, says Jerrold Heindel of NIEHS. In 2003 he therefore wrote a commentary, mentioning Baillie-Hamilton's idea, in a widely read toxicology journal, bringing what he called its "provocative hypothesis" more attention. He underlined one fact in particular. When many of the chemicals Baillie-Hamilton discussed had been tested for toxicity, researchers focused on whether they caused weight loss, which is considered a toxic effect. They overlooked instances when the chemicals caused weight gain. But if you go back to those old studies, Heindel pointed out, you see that a number of chemicals caused weight gain—and at low doses, akin to those that fetuses and newborns are exposed to, not the proverbial 800 cans of diet soda a day. Those results, he says, had "generally been overlooked."
Scientists in Japan, whose work Heindel focused on, were also finding that low levels of certain compounds, such as bisphenol A (the building block of hard, polycarbonate plastic, including that in baby bottles), had surprising effects on cells growing in lab dishes. Usually the cells become fibroblasts, which make up the body's connective tissue. These prefibroblasts, however, are like the kid who isn't sure what he wants to be when he grows up. With a little nudge, they can take an entirely different road. They can become adipocytes—fat cells. And that's what the Japanese team found: bisphenol A, and some other industrial compounds, pushed prefibroblasts to become fat cells. The compounds also stimulated the proliferation of existing fat cells. "The fact that an environmental chemical has the potential to stimulate growth of 'preadipocytes' has enormous implications," Heindel wrote. If this happened in living animals as it did in cells in lab dishes, "the result would be an animal [with] the tendency to become obese."
It took less than two years for Heindel's "if" to become reality. For 30 years his colleague Newbold had been studying the effects of estrogens, but she had never specifically looked for links to obesity. Now she did. Newbold gave low doses (equivalent to what people are exposed to in the environment) of hormone-mimicking compounds to newborn mice. In six months, the mice were 20 percent heavier and had 36 percent more body fat than unexposed mice. Strangely, these results seemed to contradict the first law of thermodynamics, which implies that weight gain equals calories consumed minus calories burned. "What was so odd was that the overweight mice were not eating more or moving less than the normal mice," Newbold says. "We measured that very carefully, and there was no statistical difference."
On the other side of the country, Bruce Blumberg of the University of California, Irvine, had also read the 2002 Baillie-Hamilton paper. He wasn't overly impressed. "She was peddling a book with questionable claims about diets that 'detoxify' the body," he recalls. "And to find a correlation between rising levels of obesity and chemicals didn't mean much. There's a correlation between obesity and a lot of things." Nevertheless, her claim stuck in the back of his mind as he tested environmental compounds for their effects on the endocrine (hormone) system. "People were testing these compounds for all sorts of things, saying, 'Let's see what they do in my [experimental] system,' " Blumberg says. "But cells in culture are not identical to cells in the body. We had to see whether this occurred in live animals."
In 2006 he fed pregnant mice tributyltin, a disinfectant and fungicide used in marine paints, plastics production, and other products, which enters the food chain in seafood and drinking water. "The offspring were born with more fat already stored, more fat cells, and became 5 to 20 percent fatter by adulthood," Blumberg says. Genetic tests revealed how that had happened. The tributyltin activated a receptor called PPAR gamma, which acts like a switch for cells' fate: in one position it allows cells to remain fibroblasts, in another it guides them to become fat cells. (It is because the diabetes drugs Actos and Avandia activate PPAR gamma that one of their major side effects is obesity.) The effect was so strong and so reliable that Blumberg thought compounds that reprogram cells' fate like this deserved a name of their own: obesogens. As later tests would show, tributyltin is not the only obesogen that acts on the PPAR pathway, leading to more fat cells. So do some phthalates (used to make vinyl plastics, such as those used in shower curtains and, until the 1990s, plastic food wrap), bisphenol A, and perfluoroalkyl compounds (used in stain repellents and nonstick cooking surfaces).
Programming the fetus to make more fat cells leaves an enduring physiological legacy. "The more adipocytes, the fatter you are," says UCSF's Lustig. But adipocytes are more than passive storage sites. They also fine-tune appetite, producing hormones that act on the brain to make us feel hungry or sated. With more adipocytes, an animal is doubly cursed: it is hungrier more often, and the extra food it eats has more places to go—and remain.
Within a year of Blumberg's groundbreaking work, it became clear that altering cells' fate isn't the only way obesogens can act, and that exotic pollutants aren't the only potential obesogens. In 2005 Newbold began feeding newborn rats genistein, an estrogenlike compound found in soy, at doses like those in soy milk and soy formula. By the age of 3 or 4 months, the rats had higher stores of fat and a noticeable increase in body weight. And once again, rats fed genistein did not eat significantly more—not enough more, anyway, to account for their extra avoirdupois, suggesting that the compound threw a wrench in the workings of the body's metabolic rate. "The only way to gain weight is to take in more calories than you burn," says Blumberg. "But there are lots of variables, such as how efficiently calories are used." Someone who uses calories very efficiently, and burns fewer to stay warm, has more left over to turn into fat. "One of the messages of the obesogens research is that prenatal exposure can reprogram metabolism so that you are predisposed to become fat," says Blumberg.
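Blumberg's point about metabolic efficiency can be illustrated with a toy energy-balance model. This is only a sketch: the intake, expenditure, and efficiency numbers below are invented for illustration and are not taken from any of the studies described here.

```python
# Toy energy-balance model: two animals with identical food intake and
# activity, differing only in how efficiently they use calories.
def fat_gain(intake_kcal, burn_kcal, days, efficiency):
    """Grams of fat stored from surplus calories over a period.

    `efficiency` scales how many calories the body actually burns to do
    the same work: an "efficient" (thrifty) metabolism burns fewer,
    leaving more surplus to store. All values are illustrative only.
    """
    surplus_per_day = intake_kcal - burn_kcal * efficiency
    kcal_per_gram_fat = 9.0  # approximate energy density of body fat
    return days * surplus_per_day / kcal_per_gram_fat

# Same food, same exercise; the "reprogrammed" animal burns 5% less.
normal = fat_gain(intake_kcal=100, burn_kcal=100, days=180, efficiency=1.00)
thrifty = fat_gain(intake_kcal=100, burn_kcal=100, days=180, efficiency=0.95)
print(normal, thrifty)  # 0.0 vs. 100.0 grams over six months
```

The point of the sketch is that a small, invisible shift in efficiency, with no change in eating or activity, compounds into a visible difference in fat stores, which is consistent with what Newbold observed in her animals.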
The jury is still out on whether soy programs babies to be overweight—some studies find that it does, other studies that it doesn't—but Newbold didn't want her new grandchild to be a guinea pig in this unintentional experiment. When her daughter mentioned that she was planning to feed the baby soy formula, as about 20 percent of American mothers do, Newbold said she would cover the cost of a year's worth of regular formula if her daughter would change her mind. (She did.) As a scientist rather than a grandmother, however, Newbold hedged her bets. "Whether our results can be extrapolated to humans," she said in 2005, "remains to be determined."
Another challenge to the simplistic calories-in/calories-out model came just this month. The time of day when mice eat, scientists reported, can greatly affect weight gain. Mice fed a high-fat diet during their normal sleeping hours gained more than twice as much weight as mice eating the same type and amount of food during their normal waking hours, Fred Turek of Northwestern University and colleagues reported in the journal Obesity. And just as Newbold found, the two groups did not differ enough in caloric intake or activity levels to account for the difference in weight gain. Turek suspects that one possible cause of the difference is the disruption in the animals' circadian rhythms. Genes that govern our daily cycle of sleeping and waking "also regulate at least 10 percent of the other genes in our cells, including metabolic genes," says Turek. "Mess up the cellular clock and you may mess up metabolic rate." That would account for why the mice that ate when they should have slept gained more weight: the disruption in their clock genes lowered their metabolic rate, so they burned fewer calories to keep their body running. Studies in people have linked eating at odd times with weight gain, too.
Mice are all well and good, but many a theory has imploded when results in lab animals failed to show up in people. Unfortunately, that is not the case with obesogens. In 2005 scientists in Spain reported that the more pesticides children were exposed to as fetuses, the greater their risk of being overweight as toddlers. And last January scientists in Belgium found that children exposed to higher levels of PCBs and DDE (the breakdown product of the pesticide DDT) before birth were fatter than those exposed to lower levels. Neither study proves causation, but they "support the findings in experimental animals," says Newbold. They "show a link between exposure to environmental chemicals … and the development of obesity."
Given the ubiquity of obesogens, traces of which are found in the blood or tissue of virtually every American, why isn't everyone overweight? For now, all scientists can say is that even a slight variation in the amounts and timing of exposures might matter, as could individual differences in physiology. "Even in genetically identical mice," notes Blumberg, "you get a range of reactions to the same chemical exposure." More problematic is the question of how to deal with this cause of obesity. If obesogens have converted more precursor cells into fat cells, or have given you a "thrifty" metabolism that husbands calories like a famine victim, you face an uphill climb. "It doesn't mean you can't work out like a demon and strictly control what you eat," says Blumberg, "but you have to work at it that much harder." He and others are quick to add that obesogens do not account for all cases of obesity, especially in adults. "I'd like to avoid the simplistic story that chemicals make you fat," says Blumberg. For instance, someone who was slim throughout adolescence and then packed on pounds in adulthood probably cannot blame it on exposure to obesogens prenatally or in infancy: if that were the cause, the extra fat cells and lower metabolic rate that obesogens cause would have shown themselves in childhood chubbiness.
This fall, scientists from NIH, the Food and Drug Administration, the Environmental Protection Agency, and academia will discuss obesogens at the largest-ever government-sponsored meeting on the topic. "The main message is that obesogens are a factor that we hadn't thought about at all before this," says Blumberg. But they're one that could clear up at least some of the mystery of why so many of us put on pounds that refuse to come off.

Friday, September 18, 2009

Gender Testing of Female Athletes

This is a REALLY interesting website from the Howard Hughes Medical Institute (HHMI) about how to verify a person's gender. Click on the link and scroll down to 'Gender Testing for Female Athletes.... find out why'.
Read the information then 'begin exploring'.
Be sure to read about male development and CAIS.

http://www.hhmi.org/biointeractive/gender/index.html

What do you think? Share your comments with the class.

Wednesday, September 16, 2009

Male bass in many US rivers feminized, study finds

By SETH BORENSTEIN, AP Science Writer
– Mon Sep 14, 5:54 pm ET

WASHINGTON – Government scientists figure that one out of five male black bass in American river basins have egg cells growing inside their sexual organs, a sign of how widespread fish feminizing has become.
The findings come from the U.S. Geological Survey in its first comprehensive examination of intersex fish in America, a problem linked to women's birth control pills and other hormone treatments that seep into rivers. Sporadic cases of feminized fish have been reported for a few years.
The agency looked at past data from nine river basins — covering about two-thirds of the country — and found that about 6 percent of the nearly 1,500 male fish had a bit of female in them. The study looked at 16 different species, with most not affected.
But the fish most feminized are two of the most sought-after freshwater sportfish: the largemouth and smallmouth bass, which are part of the black bass family. Those two species were also the most examined, with nearly 500 black bass tallied.
"It's widespread," said USGS biologist Jo Ellen Hinck. She is the lead author of the study, published online this month in Aquatic Toxicology. She said 44 percent of the sites where black bass were tested had at least one male with egg cells growing inside.
Past studies have linked the problem to endocrine-disrupting hormones, such as estrogen from women's medicines. While the fish can still reproduce, studies have shown they don't reproduce as well, Hinck said.
Intersex fish are also seen as a general warning about what some experts see as a wider problem of endocrine disruptors in the environment.
The egg cells growing in the male fish's gonads can only be seen with a microscope after the fish has been caught and dissected.
The study used data from 1995 to 2004, when the government stopped funding the research. The only river basin examined that didn't show any problems was Alaska's Yukon River Basin.
The Southeast, especially the Pee Dee River Basin in North and South Carolina, had the highest rates of feminization. In Bucksport, S.C., 10 of 11 largemouth bass examined were intersex. In parts of the Mississippi River in Minnesota and the Yampa River in Colorado, 70 percent of the smallmouth bass had female signs.
Hinck said black bass seem to be more prone to the problem, but researchers don't know why. She also found one common carp that was female with bits of male testes growing inside.

Sunday, September 13, 2009

A Spineless Solution

Sep 3rd 2009
From The Economist print edition
A better way to find novel antibiotics

Wriggle for the camera, please
NEW antibiotics are always welcome. Natural selection means the existing ones are in constant danger that pathogens will evolve resistance to them. But winnowing the few chemicals that have antibiotic effects from the myriad that might do, but don’t, is tedious. So a technique invented recently by Frederick Ausubel of Harvard University and his colleagues, which should help to speed things up, is welcome.
Dr Ausubel’s method, the details of which have just been published in ACS Chemical Biology, employs nematode worms of a species called C. elegans as its sacrificial victims. C. elegans is one of the most intensively studied animals on Earth (it was the first to have its genome read completely). It is a mere millimetre long, and can be mass produced to order, so it is ideal for this sort of work.
Dr Ausubel set out to make an automated system that could infect worms with bacteria, treat them with chemical compounds that might have antibiotic effects, and then record the results. The device he has built starts by laying the worms on a “lawn” of pathogenic bacteria for 15 hours and then mixing them with water to create a sort of worm soup. It then places the infected worms into individual enclosures, using a machine called a particle sorter that is able to drop a precise number of worms (in this case 15) into each of 384 tiny wells arrayed on a single plate. These wells have, in turn, each been pre-loaded with a different chemical that is being tested for possible antibiotic properties. Once in place, the worms are left alone for five days.
Until now, researchers engaging in this sort of work have had to monitor each wellful of worms by eye (assisted by a microscope) to determine whether the inmates were alive or dead. To avoid this time-consuming process, Dr Ausubel and his team exposed their worms to an orange stain once the five days were over. The stain in question enters dead cells easily, but cannot enter living ones. They were thus able to distinguish the quick from the dead by colour, rather than propensity to wriggle.
Moreover, using a stain in this way meant they could automate the process by attaching a camera to the microscope, taking photographs of all 384 wells, and feeding the images into a computer that had been programmed to measure the area of orange in a well and contrast that with the total area occupied by worms. When they compared this automated mechanism for identifying dead worms with manual methods that depended upon human eyes, they found it was every bit as effective.
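The automated scoring the article describes amounts to pixel counting: compare the stained (orange) area in each well with the total area occupied by worms. Here is a minimal sketch of that idea; the function names, masks, and 50% cutoff are hypothetical simplifications, not the team's actual pipeline.

```python
import numpy as np

def well_death_fraction(orange_mask, worm_mask):
    """Fraction of worm-occupied pixels that are stained orange.

    `orange_mask` and `worm_mask` are boolean arrays derived from the
    photograph of one well: True where a pixel is stained, and True
    where a pixel belongs to a worm, respectively.
    """
    worm_pixels = worm_mask.sum()
    if worm_pixels == 0:
        return 0.0
    return (orange_mask & worm_mask).sum() / worm_pixels

def well_is_mostly_dead(orange_mask, worm_mask, threshold=0.5):
    # Call a well "dead" when most of the worm area has taken up the
    # stain (the stain enters dead cells but not living ones).
    return well_death_fraction(orange_mask, worm_mask) >= threshold

# Tiny synthetic example: a 4x4 well where half the worm area is stained.
worms  = np.array([[1, 1, 0, 0]] * 4, dtype=bool)
orange = np.array([[1, 0, 0, 0]] * 4, dtype=bool)
print(well_death_fraction(orange, worms))  # 0.5
```

Applied to photographs of all 384 wells, a loop over this kind of ratio is what lets a computer replace the microscope-and-eyeball census of live versus dead worms.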
So far Dr Ausubel and his colleagues have managed to test around 37,000 compounds using their new method, and they have found 28 that have antibiotic properties. Their most exciting discovery is that some of these substances work in completely different ways from existing antibiotics. That means entirely new types of resistance mechanism would have to evolve in order for bacteria to escape their effects.
Mass screening of this sort is not, itself, a new idea in the search for drugs, but extending it so that it can study effects on entire animals rather than just isolated cells should make it even more productive. And worms, unlike, say, white mice, have few sentimental supporters in the outside world.

Friday, September 11, 2009

Liposuction Fat Turned Into Stem Cells, Study Says

John Roach for National Geographic News
September 8, 2009

The research appears online today in the journal Proceedings of the National Academy of Sciences.

Using leftovers from liposuction patients, scientists have turned human fat into stem cells, a new study says.
The discovery may also help avoid the controversy spawned by the use of stem cells from human embryos.
Human fat is "an abundant natural resource and a renewable one," said Stanford University plastic surgeon Michael Longaker, whose liposuction patients donated the fat for the study.
Longaker envisions a future in which doctors will be able to use fat from a patient to grow, in a lab, new tissues and organs for that patient.
The opportunity wouldn't be limited to the obese.
"Even if you're in great shape, there is still enough fat to be harvested from the vast majority of patients," added Longaker, who co-authored the study.
From Fat to Stem Cells to New Organs?
The reprogrammed cells, called induced pluripotent stem cells, or iPS cells, are capable of turning into most types of cells in the body.
Scientists are keen to obtain these cells to study disease and, one day, use them to grow new tissue and replacement organs.
Previously, researchers had shown that they could derive this type of stem cell from ordinary skin cells.
But the fat technique is about twice as fast and 20 times more efficient, said Joseph Wu, the study's senior author.
"We can get iPS-like colonies, basically, in about 16 days, compared to 28 days to 32 days using [skin]," said Wu, a Stanford stem cell expert. "And if you count the number of colonies in [skin] versus fat ... we get about 20 times more the number of iPS colonies."
Reprogramming Cells
To create the stem cells, the scientists injected Trojan horse-like viruses into smooth muscle cells found in fat that surrounds blood vessels. Once inside, the viruses introduced genes that reprogrammed the cells, spurring them to grow into new forms.
Previously, this process had required growing the stem cells in a culture dish with nutrients from mouse cells. This had raised alarms about the potential for contamination from mouse proteins—a potential obstacle to government approval, Longaker, the plastic surgeon, said.
That the new method works at all is "somewhat surprising" and remains something of a mystery, Longaker said.
Sidestepping Stem Cell Controversy
The fat and skin methods allow researchers to sidestep the ethical controversy over the use of embryonic stem cells from cell lines originally harvested from unused human embryos from in vitro fertilization clinics.
In addition, Longaker noted, tissue or organs grown from a patient's own stem cells should be less likely to be rejected by the body.
The speediness of the fat method, in particular, could be lifesaving, he added.
For example, if a surgeon wanted to implant new heart tissue—derived from a heart attack victim's own fat—into a patient, the doctor might have only a short time before scar tissue would compromise the operation.
If he or she were able to generate the tissue within a few weeks, Longaker said, that "would be a big deal."

Wednesday, September 9, 2009

Strange jellies of the icy depths

Matt Walker
September 1, 2009
Editor, Earth News


http://news.bbc.co.uk/earth/hi/earth_news/newsid_8231000/8231367.stm


Crossota millsae, a brilliant red and purple jellyfish found at a depth of 2000m in the Arctic Ocean, is also found off California and Hawaii.


New details are emerging about the life-forms that survive in one of the world's most inaccessible places.
Scientists have published descriptions of a range of jelly-like animals that inhabit the deep oceans of the Arctic.
The animals were originally filmed and photographed during a series of submersible dives in 2005.


The small blue jelly, a type of Narcomedusae, is new to science.

One of the biggest surprises is that one of the most common animals in the Arctic deep sea is a type of jellyfish that is completely new to science.
The deep Arctic ocean is isolated from much of the water elsewhere on the globe. One area, known as the Canadian Basin, is particularly cut off by deep-sea ridges. These huge barriers can isolate any species there from other deep-water animals.
So in 2005, an international team of scientists, funded primarily by the US National Oceanic and Atmospheric Administration's Office of Ocean Exploration and Research, conducted a series of deep-sea dives using a remote operated vehicle (ROV).

The large bright orange Aulacoctena species may get its colour from worms that it eats

Details of what they found have now been published in the journal Deep Sea Research Part II.
"There were a lot of surprises," says biologist Dr Kevin Raskoff of Monterey Peninsula College in California, US, a leading member of the dive team.
"One thing was just how many different jellies there were, and the sizes of their populations."
"Some were somewhat well known from other oceans, but had not previously been found in the Arctic. That caused us to rethink our ideas about what the typical habitat would be for the species. We also discovered a number of new species that had not been found before."


Chrysaora melanaster is one of the largest Arctic jellies, living in the top layer of water at depths of between 20m and 40m, where the temperature remains nearly constant.
During a series of dives to depths of 3000m, the ROV filmed over 50 different types of gelatinous or jelly-like animal.
The majority of animals recorded were Medusae, a particular type of jellyfish that tend to be bell or disc shaped.

This red-lipped cydippid ctenophore was a common deep-water species between 1,300 and 2,400m. It still awaits description

Other jelly-like creatures seen included ctenophores, an unusual group that can look like jellyfish but are not able to sting; siphonophores, which are actually colonies of smaller animals living together in a structure that looks like a single, larger animal; and larvaceans, plankton-like creatures unrelated to jellyfish.
Of all the Medusae observed, two species dominated at most locations visited by the ROV.
The first was a species called Sminthea arctica, which lived at depths ranging from 100m to 2,100m. This jellyfish has been recorded before by scientific expeditions.

Crossota millsae is a brilliant red and purple jellyfish also found off California and Hawaii. This specimen was collected near the bottom of the Arctic Ocean in 2,000m of water.

However, the other common jelly was a species new to science.
"Probably the single most interesting discovery was a new species of a small blue jellyfish, from a group called the Narcomedusae," says Dr Raskoff.
"This group has several interesting features that set them apart from typical jellyfish, such as the fact that they hold their tentacles over their bell as they swim."
Most jellyfish let their tentacles drift in the water behind them, but the new species holds its tentacles out in front, perhaps enabling it to better catch prey.
The new species is so unusual that it has been classified within its own genus, and will be formally described later this year.
"It was also the third most common jellyfish found on the cruise, which is really surprising when you think about the fact that even the most common species in the area can be totally new and unexpected species," says Dr Raskoff.
Another striking find was a type of ctenophore called Aulacoctena, which is one of the most spectacular examples of its kind.
At over 15cm long, its tentacles can grip almost anything underwater, yet little is known about its lifestyle.
However, one of the specimens collected by the ROV ejected its stomach contents, which revealed it may have fed on a bright orange animal.
The researchers suspect it feeds on bright orange worms that also live in the Arctic deep, and that it gets its colour from its prey.
The scientists are now keen to find out much more about how these strange and enigmatic creatures interact with their environment, and how they influence or underpin the ecology of the deep ocean in which they live.
They also hope to raise funds to explore other little-visited regions of the deep Arctic ocean, as well as exploring the Aleutian trench off the coast of Alaska.
"You don't have to go too far to find interesting areas to study, you just have to dive deep," says Dr Raskoff.

Tuesday, September 1, 2009

Genetic test detects infections before symptoms appear

By Steve Sternberg, USA TODAY
8/6/2009

Flu sufferers of the future may not have to wait until their fever spikes to learn they're ill, scientists said Thursday.

Geoffrey Ginsburg of Duke University and his colleagues say that they've developed an experimental genetic test that can detect infections before any symptoms appear.
Although the test cannot yet distinguish one virus from another, it can tell the difference between a bacterial and a viral illness, Ginsburg says.
Most diagnostic tests detect the germ itself or antibodies produced by the immune system to wipe out a virus or bacteria once the symptoms begin. The new approach works differently, by detecting genetic signs of infection in people who aren't sick yet.
The test detects the activation of genes that govern an immune response. It is carried out using a silicon chip much like those used in computers and requires no more than 10 microliters of blood from a finger-prick.
"This is the first major step in using a person's individual response to a viral or bacterial infection to lead to better diagnostics for infectious disease," Ginsburg says.
The goal of the research, sponsored by the U.S. military's Defense Advanced Research Projects Agency (DARPA), is to develop a device that can identify troops who are getting sick, in time to get them treated and to prevent them from infecting others. Ultimately, Ginsburg says, the method could also become a valuable diagnostic tool in emergency rooms and doctors' offices, where a simple test could tell the difference between the worried well and the genuinely sick.
"The true market for this may be in doctors' offices around the world, where kids are coming in with fevers and doctors have to make decisions about giving them an antibiotic," he says, noting that antibiotics can be expensive, have side effects and promote the spread of drug-resistant germs.

Arnold Monto, of the University of Michigan School of Public Health, says a test that could accurately identify viral illness, especially influenza, would be a critically important advance. "A chip like this would be great," Monto says. Not only would it help doctors more accurately diagnose people who are ill, but it would also provide public health officials with information critical to their efforts to fight epidemics.
The accuracy of the rapid tests that are currently available is inconsistent, ranging from 30% to 80%, and prompting public health officials to caution against their use. An accurate genetic test would serve multiple purposes, Monto says.
"If we could identify people before they get sick — say, in household studies — we could get a better idea of how the virus is transmitting," he says. "How influenza is transmitted affects (predictions of an epidemic's spread), control measures and personal protective measures," shedding light on how best to keep from getting infected.
Ginsburg and his team tried out the method in volunteers they infected with cold viruses, flu viruses and a less well-known cause of upper airway disease called respiratory syncytial virus. The study appears today in the journal Cell, Host & Microbe.
The study involved 57 volunteers. Their blood was drawn before and after they were infected so that each one could serve as a healthy control.
Researchers were able to tell the difference between someone who was infected and someone who wasn't with 95% accuracy. The test could also distinguish between people who were infected with viruses and those with bacteria more than 93% of the time.
The researchers are trying to determine whether the test will work in patients infected with H1N1, or swine, flu.

Monday, August 31, 2009

Scientists may have new tool in bacteria fight

Methicillin-resistant Staphylococcus aureus bacteria, commonly known as MRSA, is a major source of infections in hospitals. (CDC)

By Scott LaFee
Union-Tribune Staff Writer
August 28, 2009


For years, scientists and doctors have watched with frustration as their arsenal of antibiotics has been reduced by the growing and inevitable emergence of drug-resistant bacteria and other microbes.

What they have needed, what modern medicine requires, is a new way to attack and kill infectious, disease-causing pathogens such as tuberculosis and multidrug-resistant staphylococcus.

In a paper published today in the journal Chemistry & Biology, researchers at Burnham Institute for Medical Research in La Jolla, with colleagues at the University of Texas Southwestern Medical Center and the University of Maryland, say they may have found the basis for a new class of antibacterial agents capable of overcoming current multidrug resistance.

“The importance of emerging antibiotic-resistant pathogens cannot be overstated,” said Dr. Victor Nizet, who studies human immunology and infectious disease at the University of California San Diego. “There's no drug currently in clinical medicine for which there isn't at least one resistant strain of pathological microbe.”

Andrei Osterman, an associate professor at Burnham and member of its Infectious and Inflammatory Disease Center, said the new findings were not a silver bullet. “What we've found is a golden target,” he said.

That target is a bacterial enzyme called NadD, or nicotinate mononucleotide adenylyltransferase. If the enzyme is absent or its activity is suppressed, the bacterium dies. Versions of NadD are found in almost all cells, including human ones.

Using computer modeling, the researchers matched more than a million chemical compounds against the enzyme, eventually identifying a handful that interacted with and inhibited NadD activity. Subsequent experiments using E. coli and anthrax bacteria confirmed the compounds' inhibitory potential. But the compounds do not affect the human version of the NadD enzyme because its molecular structure is different.
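In outline, the computer-modeling step is a score-and-filter loop over a compound library: predict how well each compound binds the target enzyme, then keep only the strongest candidates for lab testing. The sketch below is a bare illustration of that loop; the compound names, scores, and cutoff are placeholders, and real docking software does far more than this.

```python
# Simplified virtual-screening loop: score each compound against a
# target and keep the strongest predicted binders for wet-lab testing.
def screen(compounds, score_binding, cutoff):
    """Return compounds whose predicted binding score beats `cutoff`.

    `score_binding` stands in for a docking program's scoring function;
    lower scores mean tighter predicted binding, a common convention.
    Results are sorted best-first.
    """
    hits = [c for c in compounds if score_binding(c) <= cutoff]
    return sorted(hits, key=score_binding)

# Placeholder library and scores, for illustration only.
library = ["cmpd_A", "cmpd_B", "cmpd_C", "cmpd_D"]
fake_scores = {"cmpd_A": -9.1, "cmpd_B": -4.0, "cmpd_C": -8.2, "cmpd_D": -2.5}

top = screen(library, fake_scores.get, cutoff=-7.0)
print(top)  # ['cmpd_A', 'cmpd_C']
```

Run over more than a million compounds rather than four, this winnowing is how the researchers arrived at the handful of NadD inhibitors they then confirmed experimentally in E. coli and anthrax.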

“We're a long way yet from having an actual, new class of antibiotics, something that microbial pathogens have not seen before,” Osterman said. “But this is a major step. It is proof of concept. We've proved that this enzyme is a good target and that by suppressing it, you can kill bacteria.”

Osterman said it will require several more years of testing and research before any new, NadD-based antibiotic might emerge. The newly identified compounds must be further refined and improved, then tested for toxicity and effectiveness in animal models and, eventually, in humans.

The bulk of that research will likely occur in academic and institutional settings, not within the pharmaceutical industry, which broadly views antibiotic research as less lucrative than other endeavors. Osterman's research was funded by a grant from the National Institute of Allergy and Infectious Diseases.

Development of a new and effective broad-spectrum antibiotic can't come soon enough. Antibiotic resistance has become a major medical issue, with some pathogens having evolved – through random mutation and the selective pressure of antibiotic misuse – to become almost invulnerable to the one-time “wonder drugs.”

Indeed, more than half of all Staphylococcus aureus infections in U.S. hospitals – where the bacterium is a persistent scourge – are resistant to formerly potent antibiotics such as penicillin, methicillin, tetracycline and erythromycin, according to the Centers for Disease Control and Prevention. In such cases, only the strongest, newest antibiotics, such as vancomycin, work – and some bacterial strains are now resistant to it.

For a variety of reasons, most pharmaceutical research has tended to focus on tweaking existing drugs to keep them effective or relevant, said UCSD's Nizet, who is also on Burnham's scientific advisory board. But that's clearly not enough.
New chemicals with antibiotic properties must be found – new chinks in microbial armor revealed.

“This research is an example of the latter,” Nizet said. “It's one of a handful of approaches that all have to be pursued in combination because things aren't going to get better anytime soon. Bacteria is perfect proof of evolution in action, constantly adapting to selective life-and-death pressures. These pathogens are not going to go away.”

Sunday, August 23, 2009

Row over South African athlete highlights ambiguities of gender

By Thomas H. Maugh II
Los Angeles Times

August 21, 2009

Some have raised doubts whether Caster Semenya is a woman. But a scientist says physical features do not always match DNA or hormones. A variety of genetic and hormonal anomalies can lead to ambiguity.

Despite what one might think, it is not always easy to tell who is male and who is female.

In sporting events, officials who watch athletes produce a urine specimen usually can see immediately whether their genitals match their proclaimed sex. But that analysis can leave room for doubt, as with South African runner Caster Semenya.

The problem, said Dr. Joe Leigh Simpson, a pediatric geneticist at Florida International University, is that genetic or hormonal abnormalities can affect any organ system "and the gonads and external genitalia are not exempt from that." When such anomalies do occur, "it can produce confusion" because hormone levels and other aspects of physiology may not match appearance.

Moreover, there is "no single process" for determining sex because every case is different, he added. For years, sports authorities considered only the sex chromosomes: if they were XX, the athlete was female; if XY, male. Technicians would swab the athlete's mouth to collect some cells, look at the sex chromosomes and make a determination.

But at the 1996 Atlanta Olympics, eight female athletes were determined to have XY chromosomes and were booted from the Games. Further studies, however, showed that they were physiologically female even though their genes said they were male, and they were reinstated.

Genes are only a blueprint, and sometimes nature doesn't follow the blueprint precisely. Take the example of XY athletes who appear to be women. At least five enzymes are required to synthesize testosterone, the hormone that produces most male characteristics, and occasionally one of those enzymes is defective. When that happens, the genitals are typically male but tiny, the person has little body hair, and he is generally feminized. By determining which testosterone precursor is present in unusually large amounts, researchers can determine which enzyme is defective. Such people are normally eligible to compete as women.

In other genetic males, the receptor that testosterone binds to is defective, so it doesn't matter how much testosterone is present. Such a male is classified as androgen-resistant, but the result is the same: feminization. Genitalia are typically female. Often, such people are raised as females and don't find out they are genetically male until they fail to get menstrual periods. They are often tall, slender and attractive, and there has been speculation that movie stars Marlene Dietrich and Greta Garbo were in this category.

Neither anomaly gives the person strength or endurance beyond that of a typical female, and such athletes are allowed to compete as women.

The condition of being genetically female, or XX, but appearing male can often be traced to congenital adrenal hyperplasia, in which the adrenal glands produce excess testosterone. The woman may look like a boy with tiny male genitals, but once a month will pass blood. If the condition is caught early in life, doctors usually recommend surgery to create female genitalia. But many people with the condition live normal lives as men.

Another possibility that could account for a disconnect between genetics and appearance is mosaicism, in which the individual carries more than one set of genes: some cells may be XX and others XY. That occurs because of a faulty division at a very early stage of the embryo, or when two embryos fuse, and it can produce a variety of mixed sexual signals.

The genetics community has "well-oiled machinery" to detect and deal with sexual abnormalities in newborns, Simpson said. But when it comes to adults, the process may end up being little more than a judgment call. The ultimate determinant in sporting events: does the abnormality give an unusual benefit?