Monday, March 28, 2011

Battling Hypertension

As you learned earlier, you can tackle high blood pressure in two ways—through lifestyle changes and with a slew of safe, effective medications. If you have prehypertension, lifestyle changes may help you stave off hypertension for years to come. If you fall into the hypertension category, you’ll probably need drugs to fight the disease, although the right habits can reduce the dose you need.

Here are the changes you can make:

• Excess weight. Lose it. Obesity triggers a cascade of changes in your body, eventually raising your blood pressure.

• Diet. High levels of salt can raise blood pressure, but the magnitude of the effect depends on the person—some people are more sensitive to the effects of salt than others. Your best chance to change your blood pressure through diet is to follow the DASH diet (Dietary Approaches to Stop Hypertension), which promotes eating fruits and vegetables and strictly limits salt, saturated fat, and alcohol. To learn the basic principles, see http://dashdiet.org.

• Exercise. Your body benefits from any physical activity, even if it’s just a brisk walk once a day. But the best way to battle high blood pressure is to include three 30-minute sessions of moderate-intensity aerobic exercise every week.

• Stress. Everyone encounters stressful events that cause a temporary increase in blood pressure. (In fact, even the stress of having a highly credentialed medical professional measure your blood pressure can cause a 30-point rise. This transitory phenomenon is called white coat syndrome, after every doctor’s favorite attire.) However, the effect of chronic, unrelenting stress is still unclear. To be on the safe side, take the time to relax, get proper sleep, and avoid situations of powerlessness (for example, working for an abusive boss).

Source of Information: O'Reilly - Your Body: The Missing Manual

Wednesday, March 23, 2011

Can people ever lose their fingerprints?

Fingerprints can indeed be removed, both intentionally and unintentionally. The May 2009 issue of the Annals of Oncology reported (online) a striking example of the latter case: a 62-year-old man from Singapore was detained while traveling to the U.S. because a routine fingerprint scan showed that he actually had none.

The man, identified only as Mr. S, had been taking the chemotherapy drug capecitabine (brand name Xeloda) to keep head and neck cancer in check. The medication gave him a moderate case of hand-foot syndrome (also called chemotherapy-induced acral erythema), which can cause swelling, pain and peeling on the palms of the hands and the soles of the feet—and, apparently, loss of fingerprints. Mr. S, who was freed when officials decided he was not a security risk, says he had not noticed that his fingerprints had vanished before he set out on his trip. After the incident, Mr. S’s physician, who authored the paper, found informal online mentions of other chemo patients complaining of lost fingerprints.

Edward P. Richards, director of the Program in Law, Science and Public Health at Louisiana State University, says that other diseases, rashes and the like can have the same effect. “Just a good case of poison ivy would do it.” But he observes that “left alone, your skin replaces at a fairly good rate, so unless you’ve done permanent damage to the tissue, it will regenerate.”

Kasey Wertheim, who is president of Complete Consultants Worldwide and has done forensic and biometric work for the U.S. Department of Defense and Lockheed Martin, says that the people who most often lose their fingerprints seem to be bricklayers, who wear down print ridges handling rough, heavy materials, as well as “people who work with lime [calcium oxide], because it’s really basic and dissolves the layers of the skin.” Secretaries may also have their prints obliterated, he adds, “because they deal with paper all day. The constant handling of paper tends to wear down the ridge detail.”

“Also,” Wertheim notes, “the elasticity of skin decreases with age, so a lot of senior citizens have prints that are difficult to capture. The ridges get thicker; the height between the top of the ridge and the bottom of the furrow gets narrow, so there’s less prominence.” Burning—with heat or chemicals—can blot out fingerprints as well, but then the resulting scars can become a unique identifier.

Wertheim says that many cases of intentional fingerprint mutilation have been documented. Usually in these instances, people damage the layer of skin that forms the “template” for the fingerprint and the epidermis at the surface.

The first case of documented fingerprint mutilation, he points out, was back in 1934, by Theodore “Handsome Jack” Klutas, who was head of a gang known as the College Kidnappers. “When the police finally caught up with him, Klutas went for his gun, and the police returned fire, killing him,” Wertheim recounts. “When they compared his postmortem fingerprints, police found that each of his prints had been cut by a knife, resulting in semicircular scars around each fingerprint. Although he was glorified in the media, it was an amateur job; the procedure left more than enough ridge detail to identify him.”

Source of Information: Scientific American Magazine, February 2010

Monday, March 14, 2011

Lost Giants

Did mammoths vanish before, during and after humans arrived?

Before humans arrived, the Americas were home to woolly mammoths, saber-toothed cats, giant ground sloths and other behemoths, an array of megafauna more impressive than even Africa boasts today. Researchers have advanced several theories to explain what did them in and when the event occurred. A series of discoveries announced last fall, at first glance apparently contradictory, adds fresh details to the mystery of this mass extinction. One prominent theory pegs humans as the cause of the demise, often pointing to the Clovis people, who left the earliest clear signs of humans entering the New World roughly 13,500 years ago. The timing coincides with the disappearance of megafauna, suggesting the Clovis hunted the animals to extinction or infected them with deadly disease. Another hypothesis supposes that climate was the culprit: it had swung from cold to warm twice, including a 1,300-year-long chill known as the Younger Dryas; such abrupt shifts might have overwhelmed the creatures’ abilities to adapt.

To pin down when the megafauna vanished, paleoecologist Jacquelyn Gill of the University of Wisconsin–Madison and her colleagues analyzed fossil dung, pollen and charcoal from ancient lake sediments in Indiana. The dung of large herbivores harbors a fungus known as Sporormiella, and the amount of its spores preserved in the sediments gives an estimate of how many mammoths and other megafauna were alive at different points in history. Pollen indicates vegetation levels, and charcoal signals how many fires burned; the extent of flora and wildfires is related to the presence of herbivores, the researchers say in the November 20 Science. Without megaherbivores to keep them in check, broad-leaved tree species such as black ash, elm and ironwood claimed the landscape; soon after, buildups of woody debris sparked a dramatic increase in wildfires. Putting these data together, Gill and her team conclude that the giant animals disappeared 14,800 to 13,700 years ago, or 200 to 1,300 years before Clovis.

A different study, however, suggests that this mass extinction happened during Clovis. Zooarchaeologist J. Tyler Faith of George Washington University and archaeologist Todd Surovell of the University of Wyoming carbon-dated prehistoric North American mammal bones from 31 different genera (groups of species). They found that all of them seemed to meet their end simultaneously between 13,800 and 11,400 years ago, findings they detailed online November 23 in the Proceedings of the National Academy of Sciences USA.

But if ancient DNA recovered from permafrost is any sign, megafauna survived in the New World for millennia after humanity arrived. As the permafrost in central Alaska cracked during springtime thaws, water that held DNA from life in the region leaked in, only to freeze again during the winter. As such, these genes can serve as markers of “ghost ranges”—remnant populations not preserved as fossil bones. Looking at mitochondrial DNA, evolutionary biologist Eske Willerslev of the University of Copenhagen and his colleagues suggest mammoths lasted until at least 10,500 years ago (as did horses, which actually originated in the Americas, vanished there, and did not return until Europeans reintroduced them). The Proceedings of the National Academy of Sciences USA published those findings online December 14.

Although the three papers appear to conflict with one another, they could be snapshots from the beginning, middle and end of a mass extinction. “If they seem to disagree, it is for the same reason as in that fable about the three blind men trying to describe an elephant—or mammoth?—by touching different parts of it,” says ecologist Christopher Johnson of James Cook University in Australia, who did not take part in any of the studies.

Johnson suggests the fungus research is superb evidence for when the decline began, but it is not as good at confirming exactly when the extinction was completed, especially over larger areas where sparse populations might have persisted. The DNA finds, on the other hand, can detect late survivors, he says, “maybe very close to the actual time that the last individuals were alive, at least in Alaska.” The bones analyzed from the period roughly in between show that the extinction process afflicted many species simultaneously. Those fossils came from the contiguous U.S., which back then was separated from Alaska by the massive Laurentide and Cordilleran ice sheets, a separation that, Faith notes, could explain why the pattern of extinction differed up there.

So what caused the decline? The jury’s still out, says Willerslev’s collaborator Ross MacPhee of the American Museum of Natural History in New York City. Johnson notes that archaeologists are turning up evidence of humans in the New World before Clovis, and he suggests they overhunted the megafauna. The beautifully crafted fluted spear points linked with the Clovis might reflect strategies that developed once the giants became rare and harder to hunt, Johnson adds.

Even if scientists cannot definitively finger the killer, research into the megafauna disappearance “is directly relevant today because we are in the middle of a mass extinction and one for which we know the cause—us,” Gill says. “Large animals are among the most threatened today,” she points out, and no one wants Africa to follow the ancient experience of the Americas.

Source of Information: Scientific American Magazine, February 2010

Wednesday, March 9, 2011

Hearing with Skin

Saying words such as punt, tackle and kick produces a puff of air that helps the listener distinguish words with similar letter sounds, even though the puffs are so subtle that they go unnoticed. Bryan Gick and Donald Derrick of the University of British Columbia set out to determine if these puffs enhance auditory perception. They had 66 participants listen to recorded sounds while receiving light, imperceptible bursts of air from thin tubes placed either over their hand or neck or in their ear.

In some cases, puffs came with the appropriate sounds (“pa” and “ta”), at other times not (“ba” and “da”). Without any puffs, participants misheard “pa” as “ba” and “ta” as “da” 30 to 40 percent of the time. Accuracy improved by 10 to 20 percent when the sounds were accompanied by an air puff over the hand or neck. No improvement took place, however, when an air puff went into the ear, suggesting that the participants were not simply hearing the airflow.

The opposite effect occurred when the volunteers received a puff with the inappropriate sounds “ba” and “da”: accuracy decreased by about 10 percent. The researchers described their work, which might lead to improved hearing aids, in the November 26 Nature (Scientific American is part of Nature Publishing Group).

Source of Information: Scientific American Magazine, February 2010

Thursday, March 3, 2011

Testosterone-Fueled Sociability

Do those with more testosterone coursing through their bodies make riskier, more aggressive decisions? To test the popular idea, researchers from Switzerland and the U.K. gave 121 women either 0.5 milligram of the hormone or a placebo and had them play an ultimatum bargaining game in pairs. With real money on the line, one player of the pair had to propose how to split the funds. The other player could reject the offer if she thought it unfair—and if the game ended in a stalemate, no money was distributed.
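For readers who want to see the payoff rule spelled out, here is a minimal Python sketch of one round of the ultimatum game described above. The pot size, offer amounts, and the responder's acceptance threshold are hypothetical values chosen purely for illustration; they are not parameters from the study.

# Minimal sketch of one round of the ultimatum game described above.
# The pot size, offer, and acceptance threshold are hypothetical
# illustrations, not values taken from the study.

def play_round(pot, offer, acceptance_threshold):
    """Return (proposer_payoff, responder_payoff) for a single round.

    The proposer offers `offer` units out of `pot` to the responder.
    The responder accepts only if the offer meets her fairness threshold;
    a rejection ends the round in a stalemate and no money is distributed.
    """
    if offer >= acceptance_threshold:
        return pot - offer, offer  # offer accepted: the pot is split
    return 0, 0                    # offer rejected: neither player gets anything

# A lowball offer of 2 out of 10 is rejected by a responder who demands
# at least 3, so both players walk away empty-handed; an offer of 5 is
# accepted and the money is split.
print(play_round(pot=10, offer=2, acceptance_threshold=3))  # -> (0, 0)
print(play_round(pot=10, offer=5, acceptance_threshold=3))  # -> (5, 5)

The sketch makes plain why a lowball offer is the riskier strategy: if the responder judges it unfair and rejects it, the proposer loses the entire pot.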

Given the common wisdom about testosterone, the players who had gotten the testosterone boost should have been more likely to take a riskier, more antisocial approach and make a lowball offer in an effort to keep more of the pot. The behavior of the test subjects, however, did not confirm the stereotypes, according to results published online December 8 by Nature (Scientific American is part of Nature Publishing Group). Those who had received testosterone actually made higher offers than those who had gotten the placebo.

Evidently, the testosterone-fueled proposals reduced bargaining conflicts and facilitated the exchange. Those with more of the hormone may have been acting out of a desire to maintain their images by avoiding rejection. The results do not necessarily mean that testosterone has no role in complicating social negotiations, but such a contribution is likely to be more complex than previously thought.


Source of Information: Scientific American Magazine, February 2010