FOOD & HEALTH SKEPTIC ARCHIVE
Monitoring food and health news
-- with particular attention to fads, fallacies and the "obesity" war
The original version of this blog is HERE. Dissecting Leftism is HERE (and mirrored here). The Blogroll. My Home Page. Email me (John Ray) here. Other mirror sites: Greenie Watch, Political Correctness Watch, Education Watch, Immigration Watch, Gun Watch, Socialized Medicine, Eye on Britain, Recipes, Tongue Tied and Australian Politics. For a list of backups viewable in China, see here. See here or here for the archives of this site
A major cause of increasing obesity is certainly the campaign against it -- as dieting usually makes people FATTER. If there were any sincerity to the obesity warriors, they would ban all diet advertising and otherwise shut up about it. Re-authorizing now-banned school playground activities and school outings would help too. But it is so much easier to blame obesity on the evil "multinationals" than it is to blame it on your own restrictions on the natural activities of kids
NOTE: "No trial has ever demonstrated benefits from reducing dietary saturated fat".
A brief summary of the last 50 years of research into diet: Everything you can possibly eat or drink is both bad and good for you
28 February, 2011
Why getting a university degree is also the secret to a long life
More stupid causal assumptions. It is politically incorrect to note that high IQ people live longer. And high IQ helps you to get a degree. Need I say more? It's not the degree that gives you a long life but rather the better health associated with a high IQ
Higher education could help you live longer, according to a study. It found people who went to college or university had lower blood pressure as they aged than those whose education finished when they left school in their teens.
With high blood pressure doubling the risk of dying from a heart attack or stroke, according to the Blood Pressure Association, the finding suggests a good education could save your life.
The biggest health benefits were found among those with master’s degrees or doctorates, and were stronger for women, the journal BMC Public Health reports.
Researchers at Brown University, Rhode Island, who tracked the health of nearly 4,000 American men and women for 30 years, also found highly educated men tended to be thinner and smoked and drank less than those without further education.
Well-educated women also smoked less and were thinner – but drank more than those who did not go to college or university.
The jobs taken by school-leavers may also impact on health.
Study leader Eric Loucks said: ‘Low educational attainment has been demonstrated to predispose individuals to high strain jobs, characterised by high levels of demand and low levels of control, which have been associated with elevated blood pressure.’
He isn’t sure why women’s blood pressure is particularly affected by education – or the lack of it. But it may be that lack of education affects a woman’s lifestyle, and so her physical health, more than a man’s.
Dr Loucks said: ‘Women with less education are more likely to be experiencing depression, they are more likely to be single parents, more likely to be living in impoverished areas and more likely to be living below the poverty line.
‘Socio-economic gradients in health are very complex. But there’s a question of what do we do about it. One of the big potential areas to intervene on is education.’
The British Heart Foundation cautioned that the differences in blood pressure noted were small but added: ‘Action is needed across all parts of society to give children the best possible start in life and reduce health inequalities.’
Education has also been linked with warding off Alzheimer’s. But it may be the case that when the condition does hit, it hits harder and progresses faster.
Parents warned against giving paracetamol and ibuprofen for mild fever
Proper caution at last
Parents should not give children with a mild fever regular spoonfuls of paracetamol and ibuprofen, doctors advise today, as they warn that doing so could extend their illness or put their health at risk.
A misplaced “fever phobia” in society means parents too frequently use both medicines to bring down even slight temperatures, say a group of American paediatricians, who warn that children can receive accidental overdoses as a result.
As many as half of parents are giving their children the wrong dosage, according to a study carried out by the doctors.
In new guidance, the American Academy of Pediatrics advises that a high temperature is often the body’s way of fighting an infection, and warns parents that to bring it down with drugs could actually lengthen a child’s illness. [Nice to have that rediscovered]
Family doctors too readily advise parents to use the medicines, known collectively as “antipyretics”, according to the authors of the guidance.
GPs also often tell parents to give their children alternate doses of paracetamol and ibuprofen – known as combination therapy – believing the risk of side effects to be minimal.
In its official guidance, the National Institute for Health and Clinical Excellence (Nice) says the use of the drugs “should be considered in children with fever who appear distressed or unwell”.
Although Nice says that both drugs should not “routinely” be given to children with a fever, it states that this approach “may be considered” if the child does not respond to being given just one of them.
Children’s paracetamol solutions such as Calpol and ibuprofen solutions such as Nurofen for Children are sold over the counter in chemists. Recommended dosage quantities vary by age.
There is a range of solutions for different age groups, meaning it is possible for parents with children of different ages to mix up which they are giving.
According to the British National Formulary, which GPs consult when prescribing or advising on medication, children should receive no more than four doses of the right amount of paracetamol in a 24-hour period, and no more than four doses of ibuprofen a day.
In its guidance today, however, the American Academy of Pediatrics notes that both medications have potential side effects and says the risks should be taken seriously.
Doctors, the authors write, should begin “by helping parents understand that fever, in and of itself, is not known to endanger a generally healthy child”. “It should be emphasised that fever is not an illness but is, in fact, a physiological mechanism that has beneficial effects in fighting infection.”
Despite this, the academy says, many parents administer paracetamol or ibuprofen even though there is only a minimal fever, or none at all. “Unfortunately, as many as half of all parents administer incorrect doses,” the authors say. A frequent error is giving children adult-sized doses, while children who are small for their age can also receive doses that are too high even if their parents follow the instructions correctly.
Paracetamol has been linked to asthma, while there have been reports of ibuprofen causing stomach ulcers and bleeding, and leading to kidney problems.
“Questions remain regarding the safety” of combination therapy, say the authors, led by Dr Janice Sullivan, of the University of Louisville Pediatric Pharmacology Research Unit, and Dr Henry Farrar, of the University of Arkansas.
Dr Clare Gerada, the chairman of the Royal College of GPs, said: “In my experience of 20 years as a GP, parents are usually pretty careful. “I think the most important thing to be worried about is keeping medicines out of the reach of children, because some taste quite nice.”
27 February, 2011
Tea gives your brain a lift and reduces tiredness
Since tea contains caffeine, which is a well-known stimulant, I am not at all clear on why this article had to be written
Natural ingredients found in a cup of tea can improve brain power and increase alertness, it is claimed. Researchers looked at the effect of key chemicals found in tea on the mental performance of 44 young volunteers.
The effects of these ingredients, an amino acid called L-theanine – which is also found in green tea – and caffeine at levels typically found in a cup of tea, were compared with a dummy treatment. The active ingredients significantly improved accuracy across a number of switching tasks, measured 20 and 70 minutes after consumption, compared with the placebo. The tea drinkers’ alertness was also heightened, the study found.
Tea was also found to reduce tiredness among the volunteers, who were aged under 40, according to the Dutch researchers reporting their findings in the journal Nutritional Neuroscience.
‘The results suggest the combination helps to focus attention during a demanding cognitive task,’ they said. Previous trials have shown that adding milk to a cup of tea does not affect the drinker’s absorption of flavonoids – or antioxidants – or disrupt the health benefits from these.
Tea drinking has already been linked with lowering the risk of heart disease, cancer and Parkinson’s. Other research shows drinking tea on a regular basis for ten or more years may help improve bone density.
Dr Tim Bond, of the industry-backed Tea Advisory Panel, said the latest findings backed a previous study which showed drinking two cups of black tea ‘improves the ability to react to stimuli and to focus attention on the task in hand’. ‘Taken together, these two studies provide evidence that consumption of black tea improves cognitive function, in particular helping to focus attention during the challenge of a demanding mental task,’ he said.
‘As a result, all this new data adds to the growing science that drinking tea, preferably four cups of tea a day, is good for our health and well being.’
Cannabis ingredient 'restores taste buds and lost pleasure in food to cancer patients'
It is well-known that pot gives users "the munchies" so this is not a big surprise. But the methodology below is best passed over in forgiving silence. I think the experimenters must be regular users
The ingredient that gives cannabis its 'high' can help cancer patients recover their sense of taste, researchers say.
A group of patients who had been treated with chemotherapy for advanced cancer were given capsules that either contained THC - the psychoactive chemical in cannabis - or dummy lookalike pills. The 21 volunteers took the tablets for 18 days and were then asked to fill in questionnaires.
Researchers from the University of Alberta, Canada, found 73 per cent of those who took THC reported an increased liking for food, compared to 30 per cent of the placebo group. Just over half of the THC takers said the medication 'made food taste better' compared to one in 10 of the control group.
While both groups consumed roughly the same number of calories during the trial, the THC patients said they ate more protein and enjoyed savoury foods more. The THC-takers also reported better quality of sleep and relaxation than the placebo group.
While the experiment is small in scale, it is the first to explore the touted qualities of THC through random assignment of volunteers and the use of a 'control' group for comparison.
Lead investigator Professor Wendy Wismer said the findings were important because cancer, or its treatment, can cripple appetite and lead to dangerous weight loss.
Many cancer patients eat less as they say meat smells and tastes unpleasant following treatment. 'For a long time, everyone has thought that nothing could be done about this,' Professor Wismer said. 'Indeed, cancer patients are often told to 'cope' with chemosensory problems by eating bland, cold and colourless food. This may well have the result of reducing food intake and food enjoyment.'
Professor Wismer said that doctors should consider THC treatment for cancer patients suffering from loss of taste, smell and appetite.
THC was well tolerated, and in terms of side effects there were no differences between the THC and placebo groups, which suggests that long-term therapy is also an option, she said.
Cannabis is a Class B drug in the UK and is illegal to have, give away or sell.
The study appears in the journal Annals of Oncology, published by the European Society for Medical Oncology.
26 February, 2011
Eating more than three slices of ham a day DOES increase the risk of bowel cancer, say government experts
More epidemiological and theoretical speculation sourced from the sensation-mongering WCRF
You should limit the amount of red meat you eat to the equivalent of three slices of ham, one lamb chop or two slices of roast beef a day, Government advisors have warned. The Scientific Advisory Committee on Nutrition (SACN) published recommendations designed to cut the risk of bowel cancer.
The latest findings are bound to muddy the already confusing debate around the nutritional benefits of red meat. Only last week a British Nutrition Foundation study claimed that the majority of adults ate ‘healthy amounts’ of red meat and there was an ‘inconclusive’ link to cancer. However, the government insists that people who eat 90g or more of red and processed meat a day should cut back. Cutting down to the UK average of 70g a day can help reduce the risk, the study from SACN said.
Red meat contains substances that have been linked to bowel cancer. One compound in particular, haem, which gives red meat its colour, has been shown to damage the lining of the colon in some studies.
The World Cancer Research Fund (WCRF) recommends limiting red meat consumption to 500g a week of cooked weight (about 700g to 750g uncooked). And it says people should avoid processed meats altogether because of the even higher risk of bowel cancer.
The charity estimated 3,800 cases of bowel cancer could be prevented every year if everyone ate less than 70g of processed meat a week. Some 1,900 cases of bowel cancer could also be prevented through cutting red meat consumption to under 70g per week.
Processed meat is generally defined as any meat preserved by smoking, curing or salting, or with chemical preservatives added to it. It is thought this process causes the formation of carcinogens, which can damage cells in the body and allow cancer to develop.
To help consumers, the Government published a list today of what is considered a 70g portion of red or processed meat. These are: one medium portion of shepherd's pie and a rasher of bacon; two standard beef burgers; six slices of salami; one lamb chop; two slices of roast lamb, beef or pork; or three slices of ham. Some 90g of cooked meat is the equivalent of about 130g of uncooked meat, due to the loss of water during cooking.
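The article's own figures imply a cooked-to-uncooked conversion factor of roughly 0.7 (90g cooked per 130g uncooked). A minimal sketch, assuming that single factor applies throughout (real water loss varies by cut and cooking method), shows how the WCRF's 500g cooked limit maps onto the 700g to 750g uncooked range quoted earlier:

```python
# Conversion factor inferred from the article's example: 90 g cooked
# corresponds to about 130 g uncooked, because water is lost in cooking.
# Illustrative sketch only -- not an official dietary formula.
COOKED_PER_UNCOOKED = 90 / 130  # ~0.69

def cooked_weight(uncooked_g):
    """Approximate cooked weight from uncooked weight."""
    return uncooked_g * COOKED_PER_UNCOOKED

def uncooked_weight(cooked_g):
    """Approximate uncooked weight from cooked weight."""
    return cooked_g / COOKED_PER_UNCOOKED

print(round(cooked_weight(130)))    # 90, recovering the article's example
print(round(uncooked_weight(500)))  # 722, inside the 700-750 g range the WCRF quotes
```

On this rough factor, the SACN 70g-a-day cooked figure likewise works out at about 100g uncooked.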
Men are more likely to eat a lot of red and processed meat - 42 per cent eat more than 90g a day compared to 12 per cent of women.
Interim chief medical officer, Professor Dame Sally Davies, said: 'Following simple diet and lifestyle advice can help protect against cancer.
'Red meat can be part of a healthy balanced diet. It is a good source of protein and vitamins and minerals, such as iron, selenium, zinc and B vitamins. 'But people who eat a lot of red and processed meat should consider cutting down. 'The occasional steak or extra few slices of lamb is fine but regularly eating a lot could increase your risk of bowel cancer.'
Experts estimate the average Briton's lifetime risk of bowel cancer to be about 5 per cent. This rises to 6 per cent if people eat an extra 50g of processed meat a day on top of what they already consume.
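Those figures mix absolute and relative risk, which is worth separating. A minimal sketch using only the 5 per cent and 6 per cent figures quoted above (the helper functions are my own illustration, not from the report):

```python
# Separating absolute from relative risk, using the lifetime bowel-cancer
# figures quoted in the article: ~5% baseline, ~6% with an extra 50 g of
# processed meat a day. Illustrative only.

def absolute_increase(baseline, exposed):
    """Absolute risk increase, as a fraction (percentage points / 100)."""
    return exposed - baseline

def relative_increase(baseline, exposed):
    """Relative risk increase, as a fraction of the baseline risk."""
    return (exposed - baseline) / baseline

baseline = 0.05  # ~5% lifetime risk
exposed = 0.06   # ~6% with the extra processed meat

print(round(absolute_increase(baseline, exposed), 3))  # 0.01: one extra case per 100 people
print(round(relative_increase(baseline, exposed), 2))  # 0.2: the basis of a "20% higher risk" headline
```

The same one-percentage-point change can thus be reported either as a 1 per cent or a 20 per cent increase, which is exactly the sort of framing this blog keeps an eye on.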
Mark Flannagan, chief executive of Beating Bowel Cancer, welcomed the advice. 'The evidence suggests that a diet high in red and processed meat may increase your risk of developing bowel cancer, but the good news is that red meat can still be enjoyed in moderation as part of a healthy balanced diet. 'This combined with an active lifestyle, and awareness of the symptoms and risk factors, could help protect you from the UK's second biggest cancer killer.'
Dr Rachel Thompson, deputy head of science for the World Cancer Research Fund, said: 'We welcome the fact that this report recognises the strong evidence that it increases risk of bowel cancer. 'We are also pleased that its suggested maximum intake is similar to the 500g per week (cooked weight) limit that World Cancer Research Fund recommends.
'However, our report made the distinction between red and processed meat and we recommended that while people should limit intake of red meat, they should avoid processed meat. 'This means that we would suggest that people following this new report's guidelines should try and make sure as little as possible of their 70g per day is processed.'
Peter Baker, chief executive of the Men's Health Forum, said: 'Men who enjoy regular breakfast fry-ups or roast beef dinners will be surprised to learn that eating too much red or processed meat might increase their risk of bowel cancer.
'We're not saying men can't occasionally enjoy a bacon sandwich or some sausages for breakfast - but the evidence tells us we need to think about cutting down on how much red and processed meat we're eating. 'This is a health benefit surely worth giving up a few sausages for.'
Last year, experts from the Harvard School of Public Health in the U.S. found that eating processed meats can increase the risk of heart disease and diabetes.
The round-up of 20 studies published worldwide found people who eat processed meats have a 42 per cent higher risk of heart disease and a 19 per cent increased risk of Type 2 diabetes. However, unprocessed red meats, such as beef, pork or lamb, do not raise the risk, the study found.
Another hymn of praise to the virtues of nuts
The fact that antioxidants shorten your lifespan is not mentioned, funnily enough. Even if it's true that they are good for your heart, they are obviously bad for other things
Eating pecan nuts can lower the risk of developing heart disease or cancer, say researchers. A study showed their naturally occurring antioxidants help reduce inflammation in the arteries.
The nuts are particularly rich in one form of the antioxidant vitamin E called gamma-tocopherol, and the study showed that its levels in the body doubled eight hours after eating pecans.
The researchers analysed 16 men and women who ate a sequence of three diets, one of whole pecans, one of pecans blended with water, and a neutral ‘control’ meal. Even after three hours, unhealthy oxidation of ‘bad’ cholesterol in the blood – which can cause heart problems – fell by up to a third.
‘Previous research has shown that pecans contain antioxidant factors. Our study shows these antioxidants are indeed absorbed in the body and provide a protective effect against diseases,’ said Ella Haddad, of California’s Loma Linda University, whose findings were published in The Journal of Nutrition. ‘This protective effect is important in helping to prevent development of various diseases such as cancer and heart disease.’
It is only the latest evidence that nuts can boost health. Walnuts help lower cholesterol, while almonds are a great source of bone-building calcium and Brazil nuts are high in the antioxidant selenium, linked to preventing some cancers.
Research from Loma Linda University published earlier in The Journal of Nutrition showed that a pecan-enriched diet lowered levels of LDL cholesterol by 16.5 percent - more than twice the reduction achieved by the American Heart Association's Step I diet, which was used as the control diet in that study. Similarly, the pecan-enriched diet lowered total cholesterol levels by 11.3 percent.
25 February, 2011
SCOTUS limits lawsuits against vaccine makers
A win for all. Legal burdens could easily wipe out vaccine provision
Vaccine makers such as Pfizer are breathing much easier today: The Supreme Court ruled they can't be sued for defective vaccine designs. That puts the kibosh on some 5,000 cases in which parents blame vaccines for their children's autism, and generally gives the pharmaceutical companies much more certainty about their potential liability. The decision was 6-2, with Justice Elena Kagan sitting out. Justices Sonia Sotomayor and Ruth Bader Ginsburg dissented.
When a product hurts someone, one possible way the victim can sue is to claim that the product was designed defectively. Claiming a defective design is tricky, because products can be inherently dangerous but still be good products -- chainsaws or cars, for example. While different courts use different tests to determine if a design was defective, the basic idea is to strike a balance between the product's usefulness as intended and the risks it creates.
Let's apply this to a hypothetical vaccine: Imagine it's possible to have a vaccine that has no side effects, but isn't particularly effective at preventing the disease it targets, and another formulation that's extremely effective at disease prevention, but does have side effects, including -- in very rare cases -- horrible ones such as brain damage or even death. Because of society's interest in promoting the use of effective vaccines -- when someone fails to immunize their child, both that child and other people are put at risk, particularly infants and the elderly -- Congress has emphatically endorsed vaccines that are effective but have some rare risks as the better design.
To prevent vaccine makers from being bankrupted by lawsuits over those rare but horrible side effects, Congress created a special vaccine compensation program for victims. If a victim's claim is rejected by that program, the victim can still sue under state law, claiming the vaccine was defective. In the case the Court just decided, the issue was whether Congress allowed all kinds of defective vaccine claims, or just claims that a vaccine was defectively manufactured or defectively marketed (i.e. that the maker failed to warn users of known risks).
The court heard the case of Hannah Bruesewitz, who developed a seizure disorder and mental disabilities as an infant, shortly after being given Wyeth's DPT vaccine. (Wyeth is now part of Pfizer.) Her family's claim for compensation was rejected by the federal program, and they turned to state court, alleging the vaccine was defectively designed because Wyeth owned a different design that was safer, but chose not to market it. Wyeth disputes the claim that the other design was safer.
In deciding the Bruesewitz's claim was barred, the Court turned to the text and structure of the law creating the federal compensation program. Effectively the decision turned on what the three words "even though" and "unavoidable" meant in this context:
"No vaccine manufacturer shall be liable [for] a vaccine-related injury or death...if the injury or death resulted from side effects that were unavoidable even though the vaccine was properly prepared and was accompanied by proper directions and warnings."
Justice Scalia, a famous textualist, wrote the opinion. The majority decided that the language permits only the claims described after the "even though" -- manufacturing and marketing defects. If Congress had meant to allow design-defect claims as well, it should have said something like "properly prepared and designed and accompanied..."
The dissent disagreed, arguing that the word "unavoidable" had a special meaning drawn from a law treatise, which changed the analysis, and that Congress hadn't clearly said it was preempting the state claims. Finally, the dissenters emphasized that shielding vaccine makers from design defect cases eliminates a powerful incentive for manufacturers to keep improving their designs. As to the last argument, Scalia conceded the tort liability motive for design improvement was indeed eliminated by his opinion, but insisted the law provided other incentives.
Scalia, who has a reputation for witty, readable and caustic opinions, clearly reveled in parsing the sentence's structure as well as talking trash (in polite Supreme Court fashion) about Sotomayor and Ginsburg's dissent. For example, Justice Scalia noted that the "even though" clause is known as a "concessive subordinate clause by grammarians" and said things like "dissent's principal textual argument is mistaken ... We do not comprehend ... its reasoning."
Justice Scalia also took a passing swipe at Congress, noting that it had been unnecessarily wordy:
"Petitioners and the dissent contend that the interpretation we propose would render part of [the vaccine law] superfluous: ... ("the injury or death resulted from side effects that were unavoidable even though") is unnecessary. True enough. But the rule against giving a portion of text an interpretation which renders it superfluous does not prescribe that a passage which could have been more terse does not mean what it says."
At the end of the day, vaccine makers win. Society also wins -- especially if the vaccine makers' threats of withdrawing from the business if they lost this case were sincere. But the Bruesewitz family loses, painfully, as their now teenage daughter still suffers from the seizure condition and mental disabilities. And if the dissent is right, and manufacturers will fail to improve their vaccine designs as a result of the decision, we will all lose eventually as side effects that could have been reduced or eliminated continue to hurt people.
Moderate amounts of alcohol protect against heart disease
But the effect is small
Drinking a glass of wine or pint of beer every evening reduces the risk of heart disease by up to a quarter, according to research.
Just days after a warning that Britain faces up to 250,000 extra deaths from liver disease unless its binge-drinking culture is tackled, two reports claim that moderate amounts of alcohol are actually good for the health.
They say that a small glass of wine for women and up to two bottles or one pint of beer can prevent the build-up of bad cholesterol, and so sensible drinkers are at lower risk of developing heart disease than teetotallers.
This is because alcohol taken in moderation increases the amount of “good” cholesterol circulating in the body.
Prof William Ghali, of the University of Calgary, said: “With respect to public health messages there may now be an impetus to better communicate to the public that alcohol, in moderation, may have overall health benefits that outweigh the risks in selected subsets of patients - any such strategy would need to be accompanied by rigorous study and oversight of impacts.”
However Cathy Ross, senior cardiac nurse at the British Heart Foundation, warned: “Drinking more than sensible amounts of alcohol does not offer any protection and can cause high blood pressure, stroke, some cancers and damage to your heart.
“If you don't drink, this is not a reason to start. Similar results can be achieved by being physically active and eating a balanced and healthy diet.”
In the first study, published online by the BMJ on Wednesday, Prof Ghali and colleagues reviewed 84 previous studies of alcohol consumption and disease. They compared the number of drinkers and non-drinkers who suffered, or died from, heart disease and stroke, and concluded that “alcohol consumption was associated with lower risk” of between 14 and 25 per cent.
The second paper, led by Dr Susan Brien at the same Canadian university, looked at 63 existing studies on alcohol consumption and cholesterol and fat levels. It concluded that consumption of one drink (of about 15g of alcohol) for women and two for men was good for the health, and that the benefit was felt regardless of whether beer, wine or spirits was drunk.
The study said that alcohol “significantly increased” high-density lipoprotein cholesterol, the “good” form that cleans blood vessel walls of “bad” cholesterol and returns it to the liver, preventing the build-ups that can lead to heart disease.
24 February, 2011
The "too clean" theory of asthma rises again. It even makes the Wall St. Journal!
The hygiene theory has been in some eclipse in recent years because of some awkward epidemiological facts. For instance, Australian Aborigines often live in extraordinary squalor but don't seem to be protected from anything because of that. In fact they have quite high rates of autoimmune diseases such as diabetes
So how do we evaluate the findings below? It's a bit difficult as the article in NEJM has not yet appeared but there are at least two possibilities. The most favourable to the theory is that it is not the overall bacterial load that matters but rather just some bacteria. So southern German farmhouses might have the helpful bacteria but Aboriginal camps may not. That is not inherently absurd but would be very much in need of proof, considering that both populations have extensive contact with all the world's infective agents via the modern-day "global village".
The second much more skeptical possibility derives from the fact that we are only looking at epidemiology here -- so all causal links are speculative. For instance, it has recently been found that Paracetamol (Tylenol) use in children under 15 months doubles their chance of getting asthma. So maybe the "dirty" farms were less health conscious in general and so used fewer medications, including paracetamol. Isn't epidemiology wonderful?
The possibilities are endless, in fact. It was found last year, for instance, that receptors for bitter tastes are not confined to the tongue but are also found in the smooth muscles of the lungs and airways. And bitter tastes RELAX those airways. So in doing any epidemiological comparisons of asthma incidence, we would have to ask whether the different groups used in the research differed in their preferences for bitter drinks, including, of course, beer!
OK. I could go on but I will have mercy at that point
Children living on farms have a lower risk of asthma than children who don't because they are surrounded by a greater variety of germs, according to two large-scale studies published Wednesday.
The prevalence of asthma in the U.S. has doubled over the past 30 years, and one theory for the increase blames urban and suburban living environments that are too clean. The latest findings, published in the New England Journal of Medicine, bolster what is often known as the hygiene theory, which says that contact with bacteria and other microbes is necessary to building a normal immune system.
The key appears to be exposure to a diversity of bugs, not just more of them, according to Markus Ege, an epidemiologist at the Children's Hospital of Munich and first author on the paper that covered both studies. "Bacteria can be beneficial for asthma," said Dr. Ege. "You have to have microbes that educate the immune system. But you have to have the right ones."
Previous research, including some conducted by Dr. Ege's group, has found that children raised on farms exhibit substantially reduced risk for asthma and allergies—lower by 30% or more—than those raised elsewhere. Though scientists had hypothesized that the difference was linked to germs, they also had to determine whether it could be due to other elements of farm life such as fresh air, exposure to farm animals, or dietary factors like drinking raw milk.
The latest study helps untangle that question by providing evidence that the reduction in risk is indeed significantly related to the variety of bacteria and other bugs a child is exposed to, according to James Gern, a professor of pediatrics and medicine at the University of Wisconsin-Madison who wrote an editorial to accompany the paper in the journal but wasn't involved in the study.
In Wednesday's paper, the researchers surveyed and collected samples of house dust in two studies of children from Southern Germany, Austria and Switzerland. One study comprised 6,800 children, about half of whom lived on farms, and the other studied nearly 9,700 children, 16% of whom were raised on a farm. Researchers then examined the dust for presence and type of microbes.
Those living on farms were exposed to a greater variety of bugs and also had a lower risk of asthma. There was evidence that exposure to a particular type of bacteria, known as gram-negative rods, was also related to lower rates of allergic responses.
Identifying which microbes are beneficial to the immune system is important because those germs could help the development of new treatments or vaccines to prevent asthma, Dr. Ege said. His group is now studying some of the microbes in greater detail.
The findings don't yield much in the way of practical suggestions, however. Dr. Ege said it wouldn't help for parents to take their children to a farm two or three times a year or to get a dog or other pet for the purpose of exposing their children to microbes, since the biggest effect appeared to be related to prolonged exposure to cows and pigs.
Could your blood group determine your health?
Since different blood types carry different antigens, it is not surprising that they might vary in their ability to fight different diseases. They may well have evolved to fight the threats most common in their original local environments. In modern populations, however, differences in disease resistance would appear to be small
Could your blood group determine your risk of major cancers, infertility and stomach ulcers, as well as diseases such as cholera and malaria? For years, the idea that blood groups had any medical significance beyond blood transfusions was dismissed by scientists.
It hasn’t been helped by the celebrities’ favourite, the ‘blood group diet’, which claims your blood type determines how your body responds to certain foods.
But a growing number of studies is revealing how our blood groups may make us more prone to lethal illnesses — or even protect us from them.
The latest research into blood types shows that having group O blood can lower your risk of heart attacks. Researchers at the University of Pennsylvania discovered this benefit in a study involving 20,000 people. Their research, to be published in The Lancet, found that most people who have a gene called Adamts7 face a significantly raised risk of suffering a heart attack. But in people with blood group O who carry Adamts7, there is no raised risk.
Dr Muredach Reilly, the lead researcher, says this knowledge may help to develop new therapies for people at risk of heart attacks. Such drugs may mimic the beneficial effect of the O blood group gene.
Only 40 per cent of people in Britain know what their group is, according to the National Blood Service. But in future, we may be far more keen to learn it — and to understand its life-saving implications. Our blood group is determined by genes inherited from our parents.
Millennia of evolution have split human blood into four types: A, B, AB and O — around 44 per cent of Britons are type O, 42 per cent are type A, 10 per cent type B and 4 per cent are AB.
What distinguishes each type is its antigens (molecules on the surface of the red blood cells that trigger immune defences). Each blood group evolved to provide defences against lethal diseases.
But each has its own weaknesses, too. People with type O blood are at less risk of dying from malaria than people with other blood groups. But they are more vulnerable to cholera and stomach ulcers caused by viruses and bacteria.
For a long time, the study of blood groups and disease was discredited — thanks to the Nazis. Otto Reche, a Nazi German ‘professor of racial science’, claimed in the Thirties that pure Aryans all had blood type A. The main ‘enemy’ blood group was, he said, B type. He used this to identify ‘inferior’ races for persecution during Hitler’s rise to power.
While such claims are scientifically absurd, in Japan there is still widespread discrimination on the grounds of blood group. In the Twenties, Japanese scientists claimed blood groups produced different personalities. The idea became so ingrained that in World War II, the Imperial Army formed battle groups based on blood type.
The idea resurfaced in the Seventies and a rash of Japanese best-sellers has spread the belief that type As are sensitive but anxious; Type Bs are cheerful but focused; Os are outgoing but stubborn; and ABs are arty and unpredictable.
This theory has a dark side. Bura-hara (blood-group harassment) is common in Japan. Company chiefs often consider candidates’ blood types when picking staff. Children at some kindergartens are also divided by blood type. Matchmaking agencies provide blood-type compatibility tests.
Nevertheless, there is serious science behind the idea that blood groups can hold the secret to fighting deadly diseases.
In the Fifties, research at four London hospitals found the risk of developing gastric cancer was much higher for people with blood group A than for those with blood group O. But people with group O had a greater risk of peptic ulcers.
This month, those findings have been confirmed by investigators at Sweden’s Karolinska Institute, which studied more than a million people over a period of 35 years. The lead researcher, Dr Gustaf Edgren, says people with group A may be more susceptible to gastric cancer risks such as smoking, alcohol and use of non-steroidal anti-inflammatory drugs. Type O people may be more vulnerable to a bacterium that can cause peptic ulcers, Helicobacter pylori.
Last October, U.S. scientists showed that a woman’s blood group can affect her chances of becoming pregnant. The study of more than 560 women undertaking fertility treatment found that those with blood type O were up to twice as likely to have a lower egg count and poorer egg quality, which could affect the chances of conceiving. Women with blood group A seemed to be better protected against their egg counts falling over time.
Researcher Edward Nejat, from New York’s Albert Einstein College, says the exact reasons for a link between blood group and ovarian reserve are not clear.
Blood groups have been linked to other reproductive troubles. Last month, a study at Harvard University found that women with AB or B group blood have a raised risk of developing ovarian cancer.
There are also fears that AB blood may double or even treble the risk of pregnant mothers suffering from the potentially lethal blood pressure condition pre-eclampsia. This finding could be harnessed to identify women at higher risk.
Other research has found that people with type AB and B blood have a much higher risk of developing pancreatic cancer.
Meanwhile, people with type O might be less at risk of cancer, but research shows they are also more vulnerable than others to norovirus, the potentially lethal vomiting and diarrhoea bug.
And men with type O might be more prone to piling on the pounds, say Danish researchers. They have found that type O males who are exposed routinely to pollution at work have a significantly raised risk of obesity compared with men of other blood types.
The researchers, at Copenhagen’s Bispebjerg University Hospital, speculate that the pollution sets off chronic inflammatory responses in the men’s bodies that can result in them becoming overweight. It’s a good excuse anyway.
Taken overall, such a weight of medical evidence might prompt us to question why we are not told of the health threats we might face due to our blood type. But in the UK, there is little work in this field.
Professor Mike Murphy, of the NHS Blood and Transplant authority, says: ‘Our colleagues in the U.S. have become increasingly involved in this type of research, particularly in trying to harness the power of blood types to fight infectious diseases. But the interest in Britain is sparse.’
Meanwhile, a lone group of British researchers is trying to turn blood-group science into a bona-fide lifesaver in one area: malaria. The effort is being led by Alex Rowe, an infection specialist at Edinburgh University’s School of Biological Sciences. Her work shows that people with blood group O are resistant to the tropical disease, which kills millions every year.
23 February, 2011
Organic produce 'not as good for your health': Vegetables grown with pesticides contain MORE vitamins
The organic approach to gardening which avoids chemicals will not deliver healthier or more tasty produce, it is claimed.
A controversial study from Which? Gardening suggests produce grown using modern artificial methods may well be better for you.
The claims, which will alarm producers and consumers who put their faith in natural food, follow a two-year study.
Non-organic broccoli was found to have significantly higher levels of antioxidants than organically grown samples. Antioxidants are beneficial chemicals that are said to improve general health and help prevent cancer.
The research found that non-organic potatoes contained more Vitamin C than the organic crop, and expert tasters found that non-organically grown tomatoes had a stronger flavour than the organic samples.
Organic bodies have rejected the claims, insisting the trial was too small to offer meaningful results.
Using mobile phones 'does not increase the risk of cancer'
The only reason this is still an issue is that a lot of conceited people hate anything that is popular and need to feel that they know better
Using a mobile phone does not increase the risk of getting brain cancer, claim British scientists. There has been virtually no change in rates of the disease - despite around 70 million mobile phones being used in the UK.
A study by scientists at the University of Manchester looked at data from the Office for National Statistics on rates of newly diagnosed brain cancers in England between 1998 and 2007. It found no statistically significant change in the incidence of brain cancers in men or women during the nine-year period.
The study, published in the journal Bioelectromagnetics, suggests radio frequency exposure from mobile phone use has not led to a 'noticeable increase' in the risk of developing brain cancers.
Lead researcher Dr Frank de Vocht, an expert in occupational and environmental health in the University of Manchester’s School of Community-Based Medicine, said it was 'unlikely we are at the forefront of a cancer epidemic'.
He said: 'Mobile phone use in the United Kingdom and other countries has risen steeply since the early 1990s when the first digital mobile phones were introduced.
'There is an ongoing controversy about whether radio frequency exposure from mobile phones increases the risk of brain cancer. Our findings indicate that a causal link between mobile phone use and cancer is unlikely because there is no evidence of any significant increase in the disease since their introduction and rapid proliferation.'
The study says there is no 'plausible biological mechanism' for radio waves to directly damage genes, resulting in cells becoming cancerous. If they are related to cancer, they are more likely to promote growth in an existing brain tumour.
The researchers said they would expect an increase in the number of diagnosed cases of brain cancer to appear within five to 10 years of the introduction of mobile phones and for this to continue as mobile use became more widespread.
The time period studied, between 1998 and 2007, would relate to exposure from 1990 to 2002 when mobile phone use in the UK increased from zero to 65 per cent of households.
The team, which included researchers from the Institute of Occupational Medicine in Edinburgh and Drexel University, Philadelphia, found a small increase in the incidence of cancers in the temporal lobe of 0.6 cases per 100,000 people over the decade, or 31 extra cases per year in a population of 52 million.
Brain cancers of the parietal lobe, cerebrum and cerebellum in men actually fell slightly between 1998 and 2007. 'Our research suggests that the increased and widespread use of mobile phones, which in some studies was associated with increased brain cancer risk, has not led to a noticeable increase in the incidence of brain cancer in England between 1998 and 2007,' said Dr de Vocht.
'It is very unlikely that we are at the forefront of a brain cancer epidemic related to mobile phones, as some have suggested, although we did observe a small increased rate of brain cancers in the temporal lobe.
'However, to put this into perspective, if this specific rise in tumour incidence was caused by mobile phone use, it would contribute to less than one additional case per 100,000 population in a decade.
'We cannot exclude the possibility that there are people who are susceptible to radio-frequency exposure or that some rare brain cancers are associated with it but we interpret our data as not indicating a pressing need to implement public health measures to reduce radio-frequency exposure from mobile phones.'
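The rate figures quoted above can be sanity-checked with a line of arithmetic. This is just a sketch using the article's own numbers, and it assumes the 0.6-per-100,000 rise is the increase over the decade studied, as the final quote suggests:

```python
population = 52_000_000        # population figure given in the article
rise_per_100k_decade = 0.6     # extra temporal-lobe cases per 100,000 over the decade

# Convert a per-decade rate per 100,000 into absolute extra cases per year
extra_cases_per_year = rise_per_100k_decade / 100_000 * population / 10
print(round(extra_cases_per_year, 1))  # 31.2 -- matching the "31 extra cases per year"
```

Read the other way, 31 extra cases a year across 52 million people is a vanishingly small absolute risk, which is the point Dr de Vocht makes.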
22 February, 2011
How dumb can officialdom get?
Only one person out of over 1,900 met the AHA's definition of ideal heart health -- yet it doesn't occur to them that their criteria are wrong. Procrustes obviously has many modern-day followers.
All the nonagenarians tottering around the place must have good hearts. How about using them as a criterion for heart health? That would put the cat among the pigeons! A lot of them smoke, drink, are inactive, grew up on high fat foods etc.
Only one out of more than 1,900 people evaluated met the American Heart Association (AHA) definition of ideal cardiovascular health, according to a new study led by researchers at the University of Pittsburgh School of Medicine.
Their findings were recently published online in Circulation.
Ideal cardiovascular health is the combination of these seven factors: nonsmoking, a body mass index less than 25, goal-level physical activity and healthy diet, untreated cholesterol below 200, blood pressure below 120/80 and fasting blood sugar below 100, explained senior investigator and cardiologist Steven Reis, M.D., associate vice chancellor for clinical research at Pitt.
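Since the definition is simply the conjunction of seven pass/fail tests, it can be expressed as a short check. This is only an illustrative sketch using the thresholds listed above; the field names are hypothetical, not from the AHA:

```python
def meets_ideal_heart_health(p):
    """Return True only if all seven AHA factors listed above are met.
    `p` is a dict with hypothetical field names."""
    return (not p["smokes"]
            and p["bmi"] < 25
            and p["meets_activity_goal"]
            and p["healthy_diet"]
            # "untreated cholesterol below 200": below threshold without medication
            and p["cholesterol"] < 200 and not p["on_cholesterol_meds"]
            and p["systolic"] < 120 and p["diastolic"] < 80
            and p["fasting_glucose"] < 100)
```

Because all seven conditions must hold simultaneously, failing any single one disqualifies a person, which helps explain why only one participant in 1,900 passed.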
"Of all the people we assessed, only one out of 1,900 could claim ideal heart health," said Dr. Reis. "This tells us that the current prevalence of heart health is extremely low, and that we have a great challenge ahead of us to attain the AHA's aim of a 20 percent improvement in cardiovascular health rates by 2020."
As part of the Heart Strategies Concentrating on Risk Evaluation (Heart SCORE) study, the researchers evaluated 1,933 people ages 45 to 75 in Allegheny County with surveys, physical exams and blood tests. Less than 10 percent met five or more criteria; 2 percent met all four heart-healthy behaviors (nonsmoking, body mass index, physical activity and diet); and 1.4 percent met all three heart-healthy factors (cholesterol, blood pressure and blood sugar). After adjustment for age, sex and income level, blacks had 82 percent lower odds than whites of meeting five or more criteria.
A multipronged approach, including change at the individual level, the social and physical environment, policy and access to care, will be needed to help people not only avoid heart disease, but also attain heart health, Dr. Reis said.
"Many of our study participants were overweight or obese, and that likely had a powerful influence on the other behaviors and factors," he noted. "Our next step is to analyze additional data to confirm this and, based on the results, try to develop a multifaceted approach to improve health. That could include identifying predictors of success or failure at adhering to the guidelines."
Daily pill may stop the ringing in your ears
The trials of this theory have not yet even begun!
A mineral found in spinach and other green leafy vegetables is being used to treat people with chronic tinnitus — characterised by an inexplicable ringing or buzzing in the ears. Researchers believe the mineral magnesium plays a key role in protecting our hearing system and that supplements taken daily will reduce tinnitus.
This condition is believed to permanently affect one in ten adults, with one in three people experiencing it at some point in their life. The clinical trial of 40 patients, at the Mayo Clinic in Arizona, is due to start this month. The trial subjects will be split into two groups: one will take a 535mg magnesium tablet every day, while the other will take a daily placebo pill.
The trial follows previous studies that linked low levels of magnesium in the body to a higher risk of noise-induced hearing loss.
Tinnitus is usually accompanied by some hearing loss and researchers believe the same biological malfunction in our body’s hearing system may cause both conditions.
Tinnitus is triggered by a range of factors, such as ear infections, adverse reactions to some medications (such as aspirin), high blood pressure or age-related hearing damage. Prolonged exposure to loud noise can also trigger it and sufferers include musicians Phil Collins and Eric Clapton.
Tinnitus can affect one or both ears and there is no cure. The condition is linked to problems with hair cells in the inner ear. These cells vibrate in response to sound waves and these vibrations are translated into electrical signals which are sent to the brain via nerves.
When these cells become weakened or damaged — through infection or over-exposure to loud noise, for instance — they send a constant stream of abnormal signals along the nerves. The brain interprets these signals as sounds of ringing, humming or buzzing. Damage to these hair cells also causes deafness.
Magnesium is needed to help maintain normal nerve function in the body and good sources include green leafy vegetables, bread and dairy products.
The UK recommended daily intake is 300mg. Higher doses can trigger diarrhoea, stomach cramps and cause complications in patients with kidney disease. Therefore, they should be taken only under medical supervision, say the scientists.
The team believe a lack of the mineral in the hair cells may contribute to tinnitus.
One function of magnesium is to stop too much calcium being released by the body. Calcium causes small blood vessels to narrow and a lack of blood flow to the hair cells is thought to contribute to the condition as it reduces their supply of oxygen and nutrients. Another theory is that magnesium blocks glutamate, a brain chemical responsible for sending signals between nerve cells.
Although this chemical is important for relaying messages throughout the body, too much of it can damage nerve cells, especially in the body’s hearing system. Previous studies suggest that exposure to loud noise triggers the over-production of glutamate.
Dr Ralph Holme, head of biomedical research at the Royal National Institute for Deaf People (RNID), says: ‘Everyday life can often be frustrating and distressing for people experiencing tinnitus, and RNID is keen to see effective treatments developed to cure or treat the condition. ‘Only a small group of people are being tested in this study, so it will be hard for researchers to show whether a magnesium supplement can meaningfully reduce the effects of tinnitus. But, the research may encourage future larger-scale trials.’
Elsewhere, researchers are testing a new treatment for hearing loss in people who listen to loud music or work in noisy environments. The trial, which is being conducted in the U.S., Spain and Sweden, will involve 60 young people who use MP3 players, 25 army officers taking part in combat training in Sweden, 130 Nato soldiers and 120 factory workers. Half of the group will be given a placebo, while the other half will be given a daily pill containing the antioxidants beta-carotene and vitamins C and E. The team hope these antioxidants will help protect the hearing cells in the ear.
The group, who have good hearing, will take the pill for two years and will be tested throughout this time.
Animal studies have found this combination of compounds can be effective in protecting against hearing loss. This is the first trial to test the theory in humans.
21 February, 2011
C-section puts children at food risk?
This idea has been grumbling on for years. Nobody seems to mention that less healthy women might be more likely to need a Caesarian and that it might be the poorer average maternal health that leads to poorer average child health -- nothing to do with the delivery method
GIVING birth by caesarean section increases the risk of your child suffering from food allergies, an expert has warned.
Pediatric allergy specialist Dr Peter Smith is urging expectant mothers to consider a vaginal delivery because of growing evidence a c-section can "significantly increase the risk of your child suffering from an allergy to cow's milk".
Admissions to hospital emergency departments for allergic reactions have increased by 500 per cent since 1990 in Australia. "It is at epidemic proportions," Dr Smith said of the massive rise in food allergies, which is likely attributable to several causes rather than one.
But symptomatic food allergy was found to occur more frequently in children born by c-section. There has been a 30 per cent growth in caesareans in the past decade in Australia.
"Several studies have shown a difference in the composition of the gastrointestinal flora of children with food allergies compared to those without," Dr Smith said. "When a child moves through the birth canal, they ingest bacteria and become naturally inoculated through a small mouthful of secretions. "The oral ingestion of those healthy bugs is the first bacteria that comes into their system." Dr Smith said that first bacteria entering the body established "the population".
Not only does Australia have one of the highest prevalence of allergic disorders in the developed world, but recent studies have demonstrated a doubling in some conditions such as allergic rhinitis (hay fever), eczema and potentially dangerous anaphylaxis. Asthma, hay fever, chronic sinusitis and "other allergy" comprise four of the top 10 most common long-term, self-reported illnesses in young people aged 12-24 in Australia.
Dr Smith said the next best thing to a "natural" birth was to follow birth with breast feeding. "Breast milk contains lots of healthy bugs (probiotics) to promote the growth of healthy bacteria and assist your child's immune system in the first few weeks of life," he said.
City life is making us sick, study warns
Pure anecdote, without even a pretence of research
CITY slickers juggling phones, computers and the stresses of modern life are being struck down by a new condition called "urban mental health", an international mental health conference heard yesterday.
In the next few decades it will be the single biggest issue facing those in big cities who may not realise their hectic lifestyles are adding to their stress, which could lead to a mental illness.
Compounding the problem is that many people are living in units on their own and parks and backyards are disappearing, causing people to be cut off from society.
One in five Australians is diagnosed each year with a mental condition of some sort, from anxiety and depression to more serious conditions such as schizophrenia, the conference at St Vincent's Hospital in Sydney was told.
Faces in the Street Urban Mental Health Research Institute director Professor Kay Wilhelm said many people were suffering chronic stress and placing themselves at risk. "We've heard there are problems living in the city which are probably becoming more so with stress, pollution etcetera," she said.
"And it's thought that in terms of the social determinants of mental health, one of the underlying factors is being chronically stressed by a whole lot of things.
"Urban mental health is really about the particular mental health issues that have to do with people living in the inner city, it's not really so much the suburbs."
In order to combat urban mental health, town planners and developers are being urged to consider community interaction and encourage meeting spots in their designs.
University of NSW Faculty of Built Environment Associate Professor Susan Thompson said the rise in community gardens was helping to bring people outdoors. "A lot of people in the city live on their own. Others don't have backyards, so they are not out in the garden or interacting with neighbours," she said. "It is about designing cities and letting residents have an input into things that will make them happy."
20 February, 2011
Red meat DOES increase cancer risk, new report will say
"Although the evidence is not conclusive". Well what is it then? Speculation is what it is -- motivated by the fact that meat is popular. The "superior" people will attack ANYTHING that is popular
Britons should cut their consumption of red and processed meat to reduce the risk of bowel cancer, scientific experts are expected to recommend in a report. The Scientific Advisory Committee on Nutrition (SACN) was asked by the Department of Health to review dietary advice on meat consumption as a source of iron.
In a draft report published in June 2009 the committee of independent experts said lower consumption of red and processed meat would probably reduce the risk of colorectal cancer.
The committee said: 'Although the evidence is not conclusive, as a precaution, it may be advisable for intakes of red and processed meat not to increase above the current average (70g/day) and for high consumers of red and processed meat (100g/day or more) to reduce their intakes.'
A daily total of 70g is equivalent to about three rashers of bacon.
The Sunday Telegraph said the full report, to be published within days, was expected to echo the committee's draft report.
A Department of Health spokeswoman said: 'The DH committee of independent experts on nutrition will shortly publish their final report on iron and health.'
The World Cancer Research Fund already recommends people limit their intake of red meat, including pork, beef, lamb and goat, to 500g a week. The fund also advises consumers to avoid too much processed meat, including hot dogs, ham, bacon and some sausages and burgers.
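The SACN daily figure and the WCRF weekly figure quoted above are roughly consistent with each other, as a quick check shows (figures taken from the article):

```python
sacn_daily_limit_g = 70    # SACN draft figure: grams of red/processed meat per day
wcrf_weekly_limit_g = 500  # World Cancer Research Fund figure: grams per week

# Scaling the daily figure to a week lands just under the WCRF limit
print(sacn_daily_limit_g * 7)  # 490 g/week, close to the WCRF's 500g
```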
It follows a review by the British Nutrition Foundation last week which demolished the ‘myths and misconceptions’ about red meat, saying that most people eat healthy amounts which are not linked to greater risk of disease.
Modern farming methods have cut fat levels, which can be even lower than chicken, while red meat provides high levels of vital nutrients, including iron.
A vegetarian eating a Cheddar cheese salad will consume seven times more fat, pound for pound, than is found in lean red meat, said the review, which looked at current evidence on red meat and health and found no evidence of ‘negative health effects’.
The Ultimate in Nanny-State Paternalism
Aside from the air we breathe, nothing is more important than the food and drink we consume. Not healthcare, not employment, not housing — nothing. Obviously, the best healthcare, the highest-paying job, and the biggest mansion in the world can’t do anything for you if you don’t eat. For someone to dictate to someone else the food and drink he should and shouldn’t consume is the ultimate in paternalism; for the state to tell someone the food and drink he should and shouldn’t consume is the ultimate in nanny-state paternalism.
Although the government’s war on poverty has been around about fifty years, its war on drugs about forty years, and its war on terrorism about ten years, it was only last year that the government declared war on childhood obesity. First Lady Michelle Obama has made this latest war her signature issue. “Obesity in this country is nothing less than a public health crisis,” said the president’s wife. She further claims that because military leaders say that one in four young people are unqualified for military service because of their weight, “childhood obesity isn’t just a public health threat, it’s not just an economic threat, it’s a national security threat as well.”
But the first lady is not alone. To help fight the war on childhood obesity, Congress last year passed, and President Obama signed into law, the Healthy Hunger-Free Kids Act. This new law, which amends the Child Nutrition Act, the Food and Nutrition Act, and the Richard B. Russell National School Lunch Act, gives the government more power to decide what kinds of foods can be sold at schools. School-sponsored fundraisers like candy sales are exempt, but only if they are “infrequent within the school.”
What many Americans probably don’t realize is that the federal government is not just concerned about what children eat in school. Since 1980, and every five years since then, the Department of Agriculture (USDA) and the Department of Health and Human Services (HHS) have joined forces to publish Dietary Guidelines for Americans. The 112-page seventh edition dated 2010 has just been published. It provides nutritional guidelines for all Americans two years and older. It is based on the 453-page Report of the Dietary Guidelines Advisory Committee on the Dietary Guidelines for Americans, 2010.
This edition of Dietary Guidelines for Americans recommends that Americans reduce their daily sodium intake, as well as their consumption of saturated fat, trans fat, cholesterol, added sugars, refined grains, and alcohol. It recommends that Americans increase their consumption of vegetables, fruit, whole grains, fat-free or low-fat milk and milk products, and seafood. It also recommends that Americans choose a variety of protein foods and foods that contain more potassium, fiber, calcium, and vitamin D. And, of course, it is also recommended that Americans “increase physical activity and reduce time spent in sedentary behaviors.”
There are two problems with these dietary guidelines, one nutritional and one philosophical.
Some physicians, nutritionists, and health professionals would strongly disagree with some of what is recommended in the Guidelines. For example, the demonization of cholesterol, butter, saturated fat, and unpasteurized dairy products, the dismissal of the glycemic index and the recommendation that 45 to 65 percent of one’s caloric intake should be from carbohydrates, and the lack of any warning about the dangers of aspartame, soy, and genetically modified foods. In fact, some of the above individuals blame the government itself for contributing to the current obesity and diabetes epidemics because it accepted the “lipid hypothesis” and the “cholesterol myth” that links dietary fat to coronary heart disease and recommended an unhealthy excess of carbohydrates in the form of bread, cereal, rice, and pasta at the bottom of its food pyramid.
There is no question that obesity is a growing problem in America. If government figures in the Dietary Guidelines for Americans are to be believed, in 2008, 10 percent of children ages 2 to 5 were obese, 20 percent of children ages 6 to 11 were obese, 18 percent of adolescents were obese, and 34 percent of adults were obese. A visit to your local buffet will probably confirm these figures.
But even if the government recruited the best and brightest nutritional scientists to solve the deepest and darkest mysteries of metabolism, diet, nutrition, exercise, and weight loss, even if they came up with the perfect diet to ensure that every American leads a long and healthy life, even if they won the war on obesity, and even if they did their work without government funding — there would still be a problem with the government’s issuing dietary guidelines.
It’s not that libertarians are indifferent to the obesity epidemic, unconcerned about the tragedy of childhood obesity, or dismissive of the health risks associated with being obese.
The more important issue is the role of government in the family and society. It is simply not the purpose of government to issue nutrition guidelines, make food pyramids, wage war on obesity, conduct scientific research, subsidize agriculture, promote or demonize certain foods, monitor school lunches, ban unpasteurized dairy products, encourage healthy eating and exercise, regulate food production and labeling, and gather statistics on obesity.
And unlike programs like Social Security, which some people say we just can’t abolish because there is no free-market alternative, in the case of diet and nutrition there are already scores if not hundreds of private organizations in existence offering analysis and advice on a myriad of health-, medical-, food-, exercise-, nutrition-, and diet-related subjects.
But, it is argued, with so many organizations offering such a variety of opinions there is no way to know what is right and so, it is claimed, we need the Departments of Agriculture and Health and Human Services to serve as the final arbiter. And what about the people who are just too lazy or too mentally deficient to do any reading and research on their own? Don’t we need the government to take care of those people by issuing things like dietary guidelines?
But how do we know that the government will get it right? Just look at how many times the Food and Drug Administration has gotten it wrong on drug policy with deadly consequences for tens of thousands of Americans. And what about those people who are just too lazy or too mentally deficient to read and follow the government’s pronouncements and guidelines? Should the state spoon-feed them every day and force them to exercise?
Once the government dictates to us the food and drink we should and shouldn’t consume, there is no stopping its reach into the family and society. And as Ludwig von Mises pointed out:
It is a fact that no paternal government, whether ancient or modern, ever shrank from regimenting its subjects’ minds, beliefs, and opinions. If one abolishes man’s freedom to determine his own consumption, one takes all freedoms away.
The issue is one of freedom. Freedom to consume or not to consume. Freedom to exercise or not to exercise. Freedom to make one’s own health and welfare decisions. Freedom to not have to fund the FDA, USDA, and HHS bureaucracies. Freedom from a nanny state. And yes, freedom to be obese.
As G. K. Chesterton reminds us:
The free man owns himself. He can damage himself with either eating or drinking; he can ruin himself with gambling. If he does he is certainly a damn fool, and he might possibly be a damned soul; but if he may not, he is not a free man any more than a dog.
The new Dietary Guidelines for Americans should be taken with a grain of salt, but no more than a grain lest you run afoul of the government-recommended daily allowance.
19 February, 2011
Hurrah - eating red meat is good for you! After all the warnings, Sunday roast not linked to heart disease
No details are given of the study below but all the evidence I have seen that opposes meat eating is very weak -- motivated more by vegetarian convictions than anything else
After years of worrying that tucking into red meat could lead to a heart attack or cancer, you can relax and enjoy the Sunday roast, say researchers. A report demolishes the ‘myths and misconceptions’ about the meat, saying that most people eat healthy amounts which are not linked to greater risk of disease.
Modern farming methods have cut fat levels in red meat, which can now be even lower than those in chicken, while red meat provides high levels of vital nutrients, including iron.
A vegetarian having a Cheddar cheese salad will eat seven times more fat, pound for pound, than lean red meat contains, says a review by the British Nutrition Foundation.
However, the World Cancer Research Fund, which advises people to curb red meat consumption and cut out processed meat, disputed the findings. [They would. Scares are meat and potatoes to them]
The 77-page review, which looks at current evidence on health and red meat, found no evidence of ‘negative health effects’. It shows that on average men in the UK eat 96g of red meat and processed meat a day and women eat 57g.
Those eating more than 140g a day are advised by the Scientific Advisory Committee on Nutrition to cut down, as these levels are linked to disease. There has been a cut in consumption over the last 30 years, with Britons eating less than many other European countries including Spain, Italy, France, Sweden and the Netherlands.
The review says there is ‘no conclusive link’ between cardiovascular disease and red meat, which actually contains some fatty acids that may protect the heart. At current levels of average consumption, there also is no evidence of a link to cancer, it says.
Cooking methods which overdo or char the meat are a much more likely cause of any link with bowel cancer, says the review.
Dr Carrie Ruxton, an independent dietician and member of the Meat Advisory Panel, which is supported by a grant from the meat industry, said: ‘This review highlights that eating red meat in moderation is an important part of a healthy balanced diet.
‘It also lays to rest many of the misconceptions about meat and health. People have been told they can’t eat it and they feel guilty when they do, but given that current intakes, on average, are well within health targets, there is no reason to eat less red meat if you enjoy it.’ An average slice of ham is 23g, beef 45g and a thick slice of lamb 90g. A small piece of steak is 100g.
Dr Ruxton said: ‘There is less saturated fat in a grilled pork steak than a grilled chicken breast with the skin left on.’
Although meat eaters often have more body fat than vegetarians, the review says it is impossible to attribute this to shunning meat as vegetarians tend to have more health-conscious lifestyles.
Dr Ruxton said many young women were iron-deficient and should be eating more red meat, but she advised that processed meat should be no more than an occasional treat. ‘You don’t need red meat every day, people should be eating fish twice a week, but if you ate a slice of red meat in a sandwich daily you can eat a portion of red meat for dinner up to four times a week and still stay within healthy limits,’ she said.
Since 2006 researchers have been giving hollow warnings about red meat. Professor Martin Wiseman, medical and scientific adviser for World Cancer Research Fund, said the study was being promoted by the meat industry, but added: ‘This paper is not a systematic review of the evidence and does not change the fact that there is convincing evidence that red and processed meat increase risk of bowel cancer. ‘This is why we recommend limiting red meat to 500g cooked weight per week and avoiding processed meat.
‘It is true that red meat contains valuable nutrients and this is why we do not recommend avoiding it altogether. But to suggest, as the authors of this review have done, that there is “no evidence” that a moderate intake of lean red meat has any negative health effects is wrong.
‘Essentially, the public has a choice between believing our findings – which are those of an independent panel of scientists after a systematic and transparent review of the complete global evidence – or the conclusions of this review.’
The review was published in the Nutritional Bulletin, the journal of the British Nutrition Foundation, a charity with funding from various sources including the food industry.
The tantalising evidence that belief in God makes you happier and healthier
I don't think there is much doubt that Christian religious belief de-stresses people. That alone could account for the correlations summarized below. And militant atheists seem such angry people -- and there is little doubt that chronic anger is bad for your heart
God has had a tough time over the past few years. On TV, in newspapers and on the internet, the debate as to whether faith has any relevance in a sceptical modern world has been as ubiquitous as it has been vigorous.
And it has been pretty clear which side is the most splenetic. From Richard Dawkins’ powerful atheist polemics to Christopher Hitchens’ public derision of the Roman Catholic Tony Blair and Stephen Hawking’s proclamation that the universe ‘has no need for God’, it seems that unbelievers have had the dwindling faithful on the run.
As research for my latest novel, Bible Of The Dead, I have spent months investigating the science of faith versus atheism, and discovered startling and unexpected evidence. It might just change the way you think about the whole debate, as it has changed my view.
I am not a religious zealot. On the contrary, I was a teenage atheist. And although in adulthood I have had a vague and fuzzy feeling that ‘there must be something out there’, I was never a regular church-goer. But what I have discovered, on my voyage through the science of faith, has astonished me.
My journey began a couple of years ago when I was travelling in Utah, the home of Mormonism. During my first week there, I approached this eccentric American religion with a typically European cynicism. I teased Mormons about their taste in ‘spiritual undergarments’; I despaired at being unable to find a decent cappuccino (Mormons are forbidden coffee, as well as alcohol, smoking, tea and premarital sex).
But then I had something of an epiphany. One night, after a long dinner, I was walking back to my hotel in downtown Salt Lake City at 2am and I suddenly realised: I felt safe. As any transatlantic traveller knows, this is a pretty unusual experience in an American city after midnight.
Why did I feel safe? Because I was in a largely Mormon city, and Mormons are never going to mug you. They might bore or annoy you when they come knocking on your door, touting their faith, but they are not going to attack you.
The Mormons’ wholesome religiousness, their endless and charitable kindliness, made their city a better place. And that made me think: Why was I so supercilious about such happy, hospitable people? What gave me the right to sneer at their religion? From that moment I took a deeper, more rigorous interest in the possible benefits of religious faith. Not one particular creed, but all creeds. And I was startled by what I found.
For a growing yet largely unnoticed body of scientific work, amassed over the past 30 years, shows religious belief is medically, socially and psychologically beneficial.
In 2006, the American Society of Hypertension established that church-goers have lower blood pressure than the non-faithful. Likewise, in 2004, scholars at the University of California, Los Angeles, suggested that college students involved in religious activities are more likely to have better mental and emotional health than those who are not. Meanwhile, in 2006, population researchers at the University of Texas discovered that the more often you go to church, the longer you live. As they put it: ‘Religious attendance is associated with adult mortality in a graded fashion: there is a seven-year difference in life expectancy between those who never attend church and those who attend weekly.’
Exactly the same outcome was recently reported in the American Journal of Public Health, which studied nearly 2,000 older Californians for five years. Those who attended religious services were 36 per cent less likely to die during this half-decade than those who didn’t. Even those who attended a place of worship irregularly — implying a less than ardent faith — did better than those who never attended.
Pretty impressive. But there’s more; so much more that it’s positively surreal. In 1990, the American Journal of Psychiatry discovered believers with broken hips were less depressed, had shorter hospital stays and could even walk further when they were discharged compared to their similarly broken-hipped and hospitalised, but comparatively heathen peers.
It’s not just hips. Scientists have revealed that believers recover from breast cancer quicker than non-believers; have better outcomes from coronary disease and rheumatoid arthritis; and are less likely to have children with meningitis.
Intriguing research in 2002 showed that believers have more success with IVF than non-believers. A 1999 study found that going to a religious service or saying a few prayers actively strengthened your immune system. These medical benefits accrue even if you adjust for the fact that believers are less likely to smoke, drink or take drugs.
And faith doesn’t just heal the body; it salves the mind, too. In 1998, the American Journal of Public Health found that depressed patients with a strong ‘intrinsic faith’ (a deep personal belief, not just a social inclination to go to a place of worship) recovered 70 per cent faster than those who did not have strong faith. Another study, in 2002, showed that prayer reduced ‘adverse outcomes in heart patients’.
But perhaps this is just an American thing? After all, those Bible-bashing Yanks are a bit credulous compared to us more sceptical Europeans, aren’t they?
Not so. In 2008, Professor Andrew Clark of the Paris School of Economics and Doctor Orsolya Lelkes of the European Centre for Social Welfare Policy and Research conducted a vast survey of Europeans. They found that religious believers, compared to non-believers, record less stress, are better able to cope with losing jobs and divorce, are less prone to suicide, report higher levels of self-esteem, enjoy greater ‘life purpose’ and report being more happy overall.
What is stunning about this research is that the team didn’t go looking for this effect — it came to them unexpectedly. ‘We originally started the research to work out why some European countries had more generous unemployment benefits than others,’ says Professor Clark. But as they went on, the pattern of beneficial faith presented itself. ‘Our analysis suggested religious people suffered less psychological harm from unemployment than the non-religious. Believers had higher levels of life satisfaction.’
So what’s going on? How does religion work this apparent magic? One of the latest surveys to suggest that religious people are happier than the non-religious was conducted by Professors Chaeyoon Lim and Robert Putnam, from Harvard, and published last year.
They discovered that many of the health benefits of religion materialise only if you go to church regularly and have good friends there. In other words, it’s the ‘organised’ part of organised religion that does a lot of the good stuff. Going to a friendly church, temple or mosque gives you a strong social network and a ready-made support group, which in turn gives you a more positive outlook on life — and offers vital help in times of need. The Harvard scientists were so startled by their findings that they considered altering their own religious behaviour.
As Professor Lim said: ‘I am not a religious person, but . . . I personally began to think about whether I should go to church. It would make my mum happy.’
But if the ‘congregation’ effect is one explanation for the good health of churchgoers, it’s not the only one. Other surveys have found that intrinsic faith is also important.
For instance, a study of nearly 4,000 older adults for the U.S. Journal of Gerontology revealed that atheists had a notably higher chance of dying over a six-year period than the faithful. Crucially, religious people lived longer than atheists even if they didn’t go regularly to a place of worship. This study clearly suggests there is a benefit in pure faith alone — perhaps this religiousness works by affording a greater sense of inner purpose and solace in grief.
This raises the question: Given all this vast evidence that religion is good for you, how come the atheists seem so set against it? They pride themselves on their rationality, yet so much of the empirical evidence indicates that God is good for you. Surely, then, it is the atheists, not the devout, who are acting irrationally?
All this will come as no surprise to many students of genetics and evolution, who have long speculated that religious faith might be hard-wired into the human mind. For instance, twin studies (research on identical siblings who are separated at birth) show that religion is a heritable characteristic: if one twin is religious, the other is likely to be a believer as well, even when raised by different parents.
Neurologists are making exciting progress in locating the areas of the brain, primarily the frontal cortex, ‘responsible’ for religious belief — parts of the brain that seem designed to accommodate faith. This research even has its own name: neurotheology.
Why might we be hard-wired to be religious? Precisely because religion makes us happier and healthier, and thus makes us have more children. In the purest of Darwinian terms, God isn’t just good for you, He’s good for your genes, too.
All of which means that, contrary to expectation, it is the atheists who are eccentric, flawed and maladaptive, and it’s the devout who are healthy, well-adjusted and normal.
Certainly, in purely evolutionary terms, atheism is a blind alley. Across the world, religious people have more children than non-religious (go forth and multiply!), while atheist societies are the ones with the lowest birth rates.
The Czech Republic is a classic example. It proclaims itself the most atheist country in Europe, if not the world; it also has a puny birthrate of 1.28 per woman, one of the lowest on the planet (so soon there won’t be any godless Czechs to proclaim their atheism).
The existence of atheism is therefore something of an anomaly. But then again, anomalies are not unknown in evolution. Think of the dodo or the flightless parrot, doomed to extinction. Are atheists similarly blighted? Are Richard Dawkins and his type destined to vanish off the face of the Earth — the victims of their own intellectual arrogance?
That’s not for me to say; it’s for you to ponder. All I do know is that reassessing the research has changed the way I think about faith. These days I go to church quite a lot, especially when I am travelling and researching my books. For instance, the other day I found myself in Cambridge — the home of Stephen Hawking — and took the opportunity to do some sightseeing of the city’s intellectual landmarks.
I strolled by the labs where Hawking does his brilliant work, popped into the pub where they announced the discovery of DNA and admired the library where Charles Darwin studied. As I did, I was in awe at the greatness of Man’s achievements. And then I went to Evensong at King’s College Chapel, and it was beautiful, sublime and uplifting. And I felt a very different kind of awe.
Sneer at faith all you like. Just don’t assume science is on your side.
18 February, 2011
Why Almost Everything You Hear About Medicine Is Wrong
If you follow the news about health research, you risk whiplash. First garlic lowers bad cholesterol, then—after more study—it doesn’t. Hormone replacement reduces the risk of heart disease in postmenopausal women, until a huge study finds that it doesn’t (and that it raises the risk of breast cancer to boot). Eating a big breakfast cuts your total daily calories, or not—as a study released last week finds. Yet even if biomedical research can be a fickle guide, we rely on it.
But what if wrong answers aren’t the exception but the rule? More and more scholars who scrutinize health research are now making that claim. It isn’t just an individual study here and there that’s flawed, they charge. Instead, the very framework of medical investigation may be off-kilter, leading time and again to findings that are at best unproved and at worst dangerously wrong. The result is a system that leads patients and physicians astray—spurring often costly regimens that won’t help and may even harm you.
It’s a disturbing view, with huge implications for doctors, policymakers, and health-conscious consumers. And one of its foremost advocates, Dr. John P.A. Ioannidis, has just ascended to a new, prominent platform after years of crusading against baseless health and medical claims. As the new chief of Stanford University’s Prevention Research Center, Ioannidis is cementing his role as one of medicine’s top mythbusters. “People are being hurt and even dying” because of false medical claims, he says: not quackery, but errors in medical research.
This is Ioannidis’s moment. As medical costs hamper the economy and impede deficit-reduction efforts, policymakers and businesses are desperate to cut them without sacrificing sick people. One no-brainer solution is to use and pay for only treatments that work.
But if Ioannidis is right, most biomedical studies are wrong.
In just the last two months, two pillars of preventive medicine fell. A major study concluded there’s no good evidence that statins (drugs like Lipitor and Crestor) help people with no history of heart disease. The study, by the Cochrane Collaboration, a global consortium of biomedical experts, was based on an evaluation of 14 individual trials with 34,272 patients. Cost of statins: more than $20 billion per year, of which half may be unnecessary. (Pfizer, which makes Lipitor, responds in part that “managing cardiovascular disease risk factors is complicated”).
In November a panel of the Institute of Medicine concluded that having a blood test for vitamin D is pointless: almost everyone has enough D for bone health (20 nanograms per milliliter) without taking supplements or calcium pills. Cost of vitamin D: $425 million per year.
Ioannidis, 45, didn’t set out to slay medical myths. A child prodigy (he was calculating decimals at age 3 and wrote a book of poetry at 8), he graduated first in his class from the University of Athens Medical School, did a residency at Harvard, oversaw AIDS clinical trials at the National Institutes of Health in the mid-1990s, and chaired the department of epidemiology at Greece’s University of Ioannina School of Medicine.
But at NIH Ioannidis had an epiphany. “Positive” drug trials, which find that a treatment is effective, and “negative” trials, in which a drug fails, take the same amount of time to conduct. “But negative trials took an extra two to four years to be published,” he noticed. “Negative results sit in a file drawer, or the trial keeps going in hopes the results turn positive.” With billions of dollars on the line, companies are loath to declare a new drug ineffective. As a result of the lag in publishing negative studies, patients may keep receiving treatments that are actually ineffective. That made Ioannidis wonder: how many biomedical studies are wrong?
His answer, in a 2005 paper: “the majority.” From clinical trials of new drugs to cutting-edge genetics, biomedical research is riddled with incorrect findings, he argued. Ioannidis deployed an abstruse mathematical argument to prove this, which some critics have questioned. “I do agree that many claims are far more tenuous than is generally appreciated, but to ‘prove’ that most are false, in all areas of medicine, one needs a different statistical model and more empirical evidence than Ioannidis uses,” says biostatistician Steven Goodman of Johns Hopkins, who worries that the most-research-is-wrong claim “could promote an unhealthy skepticism about medical research, which is being used to fuel anti-science fervor.”
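Ioannidis’s 2005 argument turns on simple probability: the chance that a “significant” finding is true depends on the prior odds that the hypothesis being tested is true at all. A minimal sketch of that arithmetic, using illustrative numbers not drawn from the paper itself:

```python
# Positive predictive value (PPV) of a "significant" finding, given the
# prior probability that a tested hypothesis is true. Numbers here are
# illustrative assumptions, not Ioannidis's own figures.

def ppv(prior, power=0.8, alpha=0.05):
    """Probability that a statistically significant result is really true."""
    true_pos = power * prior          # real effects correctly detected
    false_pos = alpha * (1 - prior)   # null effects wrongly flagged
    return true_pos / (true_pos + false_pos)

# If 1 in 10 tested hypotheses is actually true, most positives are genuine:
print(round(ppv(0.10), 2))   # 0.64 -- still roughly 1 in 3 positives is false

# In exploratory fields where perhaps 1 in 100 hypotheses is true,
# the large majority of significant findings are false:
print(round(ppv(0.01), 2))   # 0.14
```

The point of the sketch: when the hypotheses being tested are mostly long shots, conventional power and significance thresholds are not enough to make a “significant” result likely to be true.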
Even a cursory glance at medical journals shows that once heralded studies keep falling by the wayside. Two 1993 studies concluded that vitamin E prevents cardiovascular disease; that claim was overturned by more rigorous experiments, in 1996 and 2000.
A 1996 study concluding that estrogen therapy reduces older women’s risk of Alzheimer’s was overturned in 2004.
Numerous studies concluding that popular antidepressants work by altering brain chemistry have now been contradicted (the drugs help with mild and moderate depression, when they work at all, through a placebo effect), as has research claiming that early cancer detection (through, say, PSA tests) invariably saves lives.
The list goes on.
Despite the explosive nature of his charges, Ioannidis has collaborated with some 1,500 other scientists, and Stanford, epitome of the establishment, hired him in August to run the preventive-medicine center. “The core of medicine is getting evidence that guides decision making for patients and doctors,” says Ralph Horwitz, chairman of the department of medicine at Stanford. “John has been the foremost innovative thinker about biomedical evidence, so he was a natural for us.”
Ioannidis’s first targets were shoddy statistics used in early genome studies. Scientists would test one or a few genes at a time for links to virtually every disease they could think of. That just about ensured they would get “hits” by chance alone. When he began marching through the genetics literature, it was like Sherman laying waste to Georgia: most of these candidate genes could not be verified.
The claim that variants of the vitamin D–receptor gene explain three quarters of the risk of osteoporosis? Wrong, he and colleagues proved in 2006: the variants have no effect on osteoporosis.
That scores of genes identified by the National Human Genome Research Institute can be used to predict cardiovascular disease? No (2009). That six gene variants raise the risk of Parkinson’s disease? No (2010). Yet claims that gene X raises the risk of disease Y contaminate the scientific literature, affecting personal health decisions and sustaining the personal genome-testing industry.
Statistical flukes also plague epidemiology, in which researchers look for links between health and the environment, including how people behave and what they eat. A study might ask whether coffee raises the risk of joint pain, or headaches, or gallbladder disease, or hundreds of other ills. “When you do thousands of tests, statistics says you’ll have some false winners,” says Ioannidis. Drug companies make a mint on such dicey statistics. By testing an approved drug for other uses, they get hits by chance, “and doctors use that as the basis to prescribe the drug for this new use. I think that’s wrong.”
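The “false winners” effect is easy to demonstrate with a quick simulation (a hypothetical setup, not any particular study): compare two groups drawn from the same distribution many times over and count how often the comparison comes out “significant” anyway.

```python
# Illustrative sketch of multiple-testing false positives: every comparison
# below is pure noise, yet a steady fraction crosses the 5% threshold.
import random

random.seed(0)

def noise_test(n=50, trials=2000, alpha_z=1.96):
    """Compare two groups drawn from the SAME distribution, many times."""
    false_winners = 0
    for _ in range(trials):
        a = [random.gauss(0, 1) for _ in range(n)]
        b = [random.gauss(0, 1) for _ in range(n)]
        mean_diff = sum(a) / n - sum(b) / n
        se = (2 / n) ** 0.5          # std. error of the difference (variance 1 per group)
        if abs(mean_diff / se) > alpha_z:   # nominal 5% two-sided threshold
            false_winners += 1
    return false_winners

# Roughly 5% of pure-noise comparisons "succeed" -- about 100 of 2000:
print(noise_test())
```

Run a few thousand such tests, as epidemiology and off-label drug screening effectively do, and dozens of spurious associations are guaranteed by chance alone.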
Even when a claim is disproved, it hangs around like a deadbeat renter you can’t evict. Years after the claim that vitamin E prevents heart disease had been overturned, half the scientific papers mentioning it cast it as true, Ioannidis found in 2007.
The situation isn’t hopeless. Geneticists have mostly mended their ways, tightening statistical criteria, but other fields still need to clean house, Ioannidis says. Surgical practices, for instance, have not been tested to nearly the extent that medications have. “I wouldn’t be surprised if a large proportion of surgical practice is based on thin air, and [claims for effectiveness] would evaporate if we studied them closely,” Ioannidis says.
That would also save billions of dollars. George Lundberg, former editor of The Journal of the American Medical Association, estimates that strictly applying criteria like those Ioannidis pushes would save $700 billion to $1 trillion a year in U.S. health-care spending.
Of course, not all conventional health wisdom is wrong. Smoking kills, being morbidly obese or severely underweight makes you more likely to die before your time, processed meat raises the risk of some cancers, and controlling blood pressure reduces the risk of stroke.
The upshot for consumers: medical wisdom that has stood the test of time—and large, randomized, controlled trials—is more likely to be right than the latest news flash about a single food or drug.
Stress blocker helps bald mice regrow hair
If this generalizes to humans, the discoverers of the compound should make a fortune
US researchers looking at how stress affects the gut stumbled upon a potent chemical that caused mice to regrow hair by blocking a stress-related hormone, said a study today.
While the process has not yet been tested in humans, it grew more hair in mice than minoxidil, the ingredient in Rogaine, a popular treatment for baldness, said the study in the online journal PLoS One.
"This could open new avenues to treat hair loss in humans through the modulation of the stress hormone receptors, particularly hair loss related to chronic stress and aging," said co-author Million Mulugeta.
Researchers from University of California at Los Angeles and the Veterans Administration discovered the chemical compound "entirely by accident", said the study. Scientists were using genetically engineered mutant mice that were altered to produce too much of a stress hormone called corticotrophin-releasing factor, or CRF. The chronic stress condition makes them lose hair on their backs.
They injected a chemical compound called astressin-B, developed by the California-based Salk Institute, into the mice to see how the CRF-blocker would affect gastrointestinal function. When they saw no effect at first, they continued for five days. The researchers completed their gastrointestinal tests and put the mice back in cages with their hairier counterparts.
When they returned to get the stressed-out mice three months later for more tests, they discovered they could no longer tell them apart because the mice had regrown all the hair they had lost.
"Our findings show that a short-duration treatment with this compound causes an astounding long-term hair regrowth in chronically stressed mutant mice," said Professor Mulugeta of the David Geffen School of Medicine at UCLA.
Not only did it help grow hair, it also appeared to help hair maintain its color and not go grey. "This molecule also keeps the hair color, prevents the hair from turning gray," he said.
The short five-day time span of treatments brought hair growth effects that lasted up to four months, which was also surprising to researchers. "This is a comparatively long time, considering that mice's life span is less than two years," Professor Mulugeta said.
Researchers gave the bald mice treatments of "minoxidil alone, which resulted in mild hair growth, as it does in humans. This suggests that astressin-B could also translate for use in human hair growth," said the study.
Co-author Yvette Tache, a professor of medicine at UCLA, said it could take up to five years to start a clinical trial in humans. "This research could be beneficial in a lot of diseases which are stress-related in their manifestations or exacerbation of symptoms," she said, noting that no sign of toxicity has appeared after extensive tests on mice.
Professor Mulugeta said talks were underway with a major cosmetics firm to fund a study in humans. "In general, the concept that interfering with the stress hormones or their receptors to prevent or treat stress-related diseases is a very valid concept," he said.
Jean Rivier, a Swiss professor at the Salk Institute, said he has reason to believe the process could be useful in other stress-related ailments, from psoriasis to depression to heart and artery problems. "You bring back the skin to a normal acidic state where it can essentially go back to a normal stage whereby hair will grow again, and this is very reminiscent of what is seen in relapses and remissions," he said.
The CRF-blocker was studied by other major pharmaceutical companies but was abandoned because it could not be administered orally, he said.
However, the latest method uses peptide analogs that bind to the same receptors but via different types of molecules, which could be administered through injections or potentially through a nasal spray.
17 February, 2011
Michelle Obama: First Lady of Junk Science
While her husband may have paid lip service to ending the abuse of science for "politics or ideology," first lady Michelle Obama gave herself a super-sized waiver. Two of her showcase social engineering campaigns -- tax preferences for breast-pumping working mothers and expanded nutrition labels -- are based on distorting or dismissing the prevailing public health literature.
Just as the White House costumed Obamacare activists in white lab coats, the fashionable Mrs. O has cloaked her meddling anti-obesity crusade in medical fakery.
Over the past year, the first lady has marshaled a taxpayer-subsidized army of government lawyers, bureaucrats and consultants against the "national security threat" of childhood obesity. She has transformed the East Wing of the White House into Big Nanny's new Central Command headquarters. The biggest threats to Mrs. Obama's 70-point plan for national fitness: parental authority and sound science.
As part of her "Let's Move!" anniversary celebration this week, Mrs. Obama rolled out a new breastfeeding initiative because "kids who are breastfed longer have a lower tendency to be obese." She made her assertion to an invitation-only group of handpicked reporters who were barred from asking questions about her scientific conclusions. It's not healthy to challenge Super Nanny, you see.
After the Internal Revenue Service carefully studied and rejected an advocacy push to treat nursing equipment as a tax-deductible medical expense last fall, the tax agency suddenly reversed itself in time for the first lady's new public relations tour. The surgeon general has also issued a "Call to Action" to pressure private businesses to adopt more nursing-friendly environments to combat childhood obesity, all while denying that government is intruding on personal decisions. "No mother should be made to feel guilty if she cannot or chooses not to breastfeed," Surgeon General Regina Benjamin asserted, while laying an unmistakable guilt trip on moms and moms-to-be.
So, what do studies on breastfeeding and babies' weight actually say? Rebecca Goldin, Ph.D., research director of George Mason University's Statistical Assessment Service, points out that the literature is inconclusive or demonstrates that the health advantages of bosom over bottle are short-lived:
"Indeed, there is little evidence that using formula causes obesity. There is a correlation between formula use and obesity among babies and children ... though this correlation is not consistent in all studies. Some of these studies show a relationship in only some demographics and not others. Others show that the disadvantage of bottle-feeding and/or formula mostly goes away by the time a child is about 4 years old.
"The result is that we cannot discover whether breastfeeding is correlated with obesity because infant formula or bottle feeding leads to subsequent overeating or disposition to being overweight, or whether those parents who breastfeed are also more likely to offer their children green beans instead of French fries. Despite weak evidence, there is a lingering conviction that formula causes obesity among pediatricians and the press; if anything, the study about infants should make us reflect more carefully on this conclusion."
Alas, such nuance from Mrs. Obama and her unquestioning media water-carriers is scarcer than tofu at Taco Bell.
Don't get me wrong. As a proud mom who breastfed both of her babies, I've been and will always be a vocal defender of women who have devoted the time, dedication and selflessness it takes. But there are myriad individual reasons beyond Mrs. O's expansive goal of battling the collective scourge of childhood obesity -- intimate bonding and health benefits for the mom, not just the baby, for example -- that lead women to nurse.
And we don't need Big Brother or Big Mother to lead the Charge of the Big Bosom to persuade us of the personal benefits. Many private hospitals and companies have already adopted nursing-friendly environments. If it's as good for their bottom lines as it is for babies' bottoms, they don't need a government mandate to do the right thing.
But as I've noted many times over the past year, Mrs. O's real interest isn't in nurturing nursing moms or slimming down kids' waistlines. It's in boosting government and public union payrolls, along with beefing up FCC and FTC regulators' duties.
Take another East Wing pet project: leaning on private businesses to print expanded front-package nutrition labels warning consumers about salt, fat and sugar. The first lady's anti-fat brigade assumes as an article of faith that her top-down designer food labels will encourage healthier eating habits. It's a "no-brainer," Mrs. Obama insists.
However, the latest study on this very subject -- funded by no less than the left-wing Robert Wood Johnson Foundation -- confirms other recent research contradicting the East Wing push. A team led by Duke-NUS Graduate Medical School's Eric Finkelstein, published in the peer-reviewed American Journal of Preventive Medicine, found that mandatory menu-labeling in Seattle restaurants did not affect consumers' calorie consumption. "Given the results of prior studies, we had expected the results to be small," the researchers reported, "but we were surprised that we could not detect even the slightest hint of changes in purchasing behavior as a result of the legislation."
Will the first lady and her food cops be chastened by the science that undermines their spin? Fat chance.
GM food hysteria
By Mark Tester, a professor of plant physiology
The bigger cities grow, the more insular they become. This truism is ever so apparent in the recent rekindling of debate about the production of genetically modified foods and crops.
Urban communities are becoming so disconnected from how food is actually produced that conventional farming faces growing problems of public perception and trust.
This is not helped by the constant, ideologically driven doubt-mongering about GM technology by professional activists, such as Greenpeace, which undermines public confidence in the science that underpins our modern, efficient and sustainable food production system.
We are told GM technology is unnecessary, yet the Food and Agriculture Organisation of the United Nations estimates the number of humans on the planet will rise from 6 billion in 2000 to nearly 9 billion in 2050, and that food demand will rise by 70 per cent. Given historical increases in food production, it is improbable that farming systems based on conventional science and organics would be able to supply this increase in demand. There are more hungry people now than at any time in history.
GM science will be an essential tool for food security in the decades ahead. In fact, it already is. In 2009, 134 million hectares of crops bred with the aid of the technology were planted in 26 countries. It is estimated that GM varieties of corn, soybean, cotton and canola have delivered tens of millions of tonnes of extra food and fibre since 1996.
As for the contention that agri-corporations will "control seed supplies", farmers appreciate that research and development businesses need a return on their long-term investments. Most modern plant breeding is now done by public-private collaborations, and patents and royalties are required to fund the work. It is the global mechanism to provide incentive to innovate. Farmers make economic decisions to use GM; they do not have to use it and can equally use non-GM seeds.
Eco-alarmist opponents of GM technology repeatedly refer to studies that purport to have discovered something harmful about its use. But such studies have, without exception, been discredited by the weight of mainstream scientific evidence, opinion and peer review, and by recognised independent regulatory agencies around the world. The truth is that approved GM varieties are safe for human health and the environment; they are subjected to far more intense, transparent and accountable analysis than conventionally bred varieties.
Australia's world-class regulatory system is designed to pick up anomalies and look for potential problems. Safety is the first priority. Why would it be anything other than that?
The proof of the safety of approved GM varieties is, literally, in the pudding: billions of meals containing GM ingredients have been consumed. In our stomach, all the proteins, starches, fats and oils that are in lettuce, carrots, potatoes, pumpkin, tomatoes, corn, soybeans, canola, dairy products, beef, lamb, chicken, fish and shellfish are all broken down into the basic biochemical amino-acid building blocks, and no genetic material becomes incorporated into our genes.
While city people are urged to stop the use of GM canola, our farmers are heading in a different direction. In only the third year of its commercial production in Australia, hundreds of our farmers chose to grow nearly 133,300 hectares of GM canola in NSW, Victoria and WA last year - nearly 12 per cent of the total canola crop.
Incidentally, more than 90 per cent of Australia's cotton crop is grown from GM varieties. City folk should compare the hullabaloo over GM canola to the non-issue of GM cotton. Both crops use Roundup-Ready technology, both crops use few agricultural chemicals and both crops produce edible oils for cooking and meal for livestock supplements. So why the fuss about canola?
What will the activists decry when more GM varieties come to market? Australian researchers are using GM to help develop papaya, pineapple, sugar cane, grape vines, carnations, chrysanthemums, rice, white clover, wheat, Indian mustard, banana, barley, perennial ryegrass, tall fescue, corn and rose varieties with new traits that reduce production risks and underpin yield.
It is a good thing that people have a view about how their food is produced. It is best to have an informed view.
16 February, 2011
Why calorie counts don't reduce eating (because taste and price are more important)
Calorie counts on menus do not make diners eat less, research suggests. Taste, price and location were all more important.
The research was done in New York, where the display of calorie counts on menus in fast food restaurants is mandatory, but the study raises questions about the value of similar voluntary schemes in the UK.
In the study, health policy experts from New York University stopped hundreds of children and teenagers as they left fast food restaurants, before and after it became compulsory to put calorie counts on menus. The youngsters were interviewed about what they had eaten and why they had chosen that particular restaurant.
After labelling was brought in, more than half of those interviewed said they had noticed the calorie information. However, few paid any attention to it – with just 16 per cent saying it influenced their choices.
Despite what they said, displaying calorie counts had little or no effect on the amount of food eaten, the International Journal of Obesity reports. Indeed, most underestimated the number of calories they'd consumed, with many taking in almost 500 more than they realised – the equivalent of a Big Mac.
Researcher Dr Brian Elbel said his findings were all the more important because they came from a ‘real world’ setting. But he also admitted the study was carried out on a relatively small number of children and teenagers, and that adults tend to be more health-conscious.
Take zinc to fight a cold
Cochrane studies are unusually careful so this looks convincing. One hopes that experimenter expectation effects have been thoroughly excluded, however. The fact that studies continued after initial negative results is troubling
The best way to shake off a cold is to take supplements of the trace metal zinc, scientists say today.
While zinc is perhaps best known for protecting cars against rust, in minute quantities it has a host of important physiological functions.
Now a review of 15 clinical trials published since 1984 has concluded that taking supplements can reduce the length of a cold and help ward one off in the first place.
The conclusions of the Cochrane Collaboration, an Oxford-based not-for-profit institution that reviews existing studies to spot trends missed by looking at them individually, could lead in time to zinc replacing vitamin C as the cold 'cure' of choice.
The latest Cochrane Review found that people who took a zinc syrup solution or lozenge every two hours while they had a cold were twice as likely to have shed it within a week as those who took a placebo. Children who took a zinc tablet once a day for at least five months were also a third less likely to get colds than those who took a placebo.
The scientists concluded: "Evidence shows that zinc is beneficial for the common cold in healthy children and adults living in high-income countries. Pooled results from the trials showed that zinc reduced the duration and severity of common cold symptoms when used therapeutically. Zinc also reduced the incidence of the common cold, school absence and antibiotic use in healthy children."
The last Cochrane Review on zinc, in 1999, found "no strong evidence" that it had any positive effect on colds, but studies since then have forced a reappraisal.
However, the scientists cautioned that they did not yet know what dose was best, and said some zinc formulations had side effects including nausea, bad taste and diarrhoea. Because of these problems, they said it was "difficult to make firm recommendations about the dose, formulation and duration that should be used".
In the studies using zinc as a therapeutic medicine, doses ranged from 30 to 160mg per day. In the two preventive trials, one used a 15mg daily dose of zinc sulphate solution in syrup, while the other used a 10mg dose of zinc sulphate in a tablet.
Tablets containing 10, 15 or 25mg of zinc compound can easily be bought in health food shops and pharmacies. The recommended daily amount is 15mg.
Four years ago the Cochrane Collaboration concluded that taking Vitamin C supplements had no effect on most people's ability to avoid colds, although it did for particular groups under extreme physical stress.
15 February, 2011
Are you a health food junkie?
The obsession with cutting out 'bad' foods and attaining a flawless physique can lead to dangerous eating patterns.
We all know the type. They never let wheat, yeast or dairy pass their lips. They've cut out alcohol and caffeine. They're obsessed with healthy eating — yet every day, they look more unwell and unhappier.
These are the symptoms of a condition called 'orthorexia' by dieticians. It is, apparently, on the increase — particularly in professional women in their 30s.
The term 'orthorexia' was coined in 1997 by Californian doctor Steven Bratman in his book Health Food Junkies, and means 'correct appetite' (from the Greek orthos, right, and orexis, appetite). It is a fixation with eating 'pure' food that, far from doing you good, can become so extreme that it leads to malnutrition, chronic ill health and depression.
Plenty of celebrities are secret long-term orthorexics, passing off their limited diet of sashimi or steamed broccoli as 'getting in shape for a part'.
But they're not the only ones. Many of us have fallen into the same trap, believing that the more 'bad' foods we cut out, the healthier we'll be. But it's the start of a slippery slope.
And it doesn't just stop at food - orthorexics are often gym bunnies, who'll work out for two hours and then go for a ten-mile run.
The grim truth is that this level of health obsession is a potentially dangerous form of self-control. And it's increasingly prevalent.
"Women are much more likely today to become exercise and diet-addicted because of our celebrity-obsessed culture and the pressure to be slim," says Lucy Jones of the British Dietetic Association.
"While this condition is not as dangerous as anorexia, any obsession that cuts out entire food groups can lead to long-term health damage such as a lack of bone density, heart attacks, strokes and diabetes."
It's more difficult to spot than anorexia or bulimia because sufferers can simply insist that they're 'looking after themselves' or 'have a wheat intolerance'.
But when the desire to be healthy moves from avoidance of junk food to a fear of perfectly healthy food groups such as dairy, carbs or wheat, it's a warning sign of orthorexia.
More on the dangerous side-effects of statins
Dr. Graveline has an interesting background that makes him particularly suited to speak on the topic of statin drugs. He's a medical doctor with 23 years of experience whose health was seriously damaged by a statin drug. His personal questions brought him out of retirement to investigate statins, which he's been doing for the past 10 years.
As a former astronaut, he would get annual physicals at the Johnson Space Center in Houston. In 1999 his cholesterol hit 280 and he was given a prescription for Lipitor.
"When they suggested Lipitor (10 mg), I went along with it because I had no reason to be particularly worried about statin drugs," he says. "I had used it a year or so before my retirement, but I wasn't a big user."
However, it quickly became apparent that something was seriously wrong.
"It was six weeks later when I experienced my first episode of what was later diagnosed as transient global amnesia," Dr. Graveline says.
"This is an unusual form of amnesia wherein you immediately, without the slightest warning, are unable to formulate new memory and you can no longer communicate. Not because you cannot talk, but you can't remember the last syllable that was spoken to you. So nothing you say is relevant anymore. In addition, you have a retrograde loss of memory, sometimes decades into the past."
He "woke up" about six hours later in the office of a neurologist, who gave him the diagnosis: transient global amnesia. He quit taking the Lipitor despite the reassurances from his doctors that the drug was not of concern, and that it was just a coincidence.
He had no relapses during the remainder of the year, but his cholesterol was still around 280 at his next physical. He was again urged to take Lipitor, and he relented.
"I admit I was concerned, but I had talked to maybe 30 doctors and a few pharmacologists during the interval," Dr. Graveline says. "They all said 'statins don't do that.' So I allowed myself to go back on statins, but this time I took just 5 mg.
…[E]ight weeks later, I had my second, and my worst episode. In this one, I was a 13-year-old high school student for 12 hours... This is what convinced me, when I finally woke up, that something was wrong with the statin drugs. And yet, the doctors were, for years after that, still saying that this was just a remarkable coincidence.
This took me out of retirement and I've been actively involved in researching statin drugs ever since."
Statin Drugs: Not Nearly as Safe as You're Told
Dr. Graveline has since published a book about his discoveries called Lipitor: Thief of Memory.
"In trying to reach an explanation, I called Joe Graedon and asked him if he had ever heard of any unusual reactions associated with statins," Dr. Graveline says of his initial investigations.
He was directed to the statin effects study by Beatrice Golomb in San Diego, California, and his story was also published in a syndicated newspaper column. Within weeks, the web site he had created received reports of 22 cases of transient global amnesia, along with hundreds of cases of cognitive damage. At present, over 2,000 cases of transient global amnesia associated with the use of statins have been reported to FDA's MedWatch.
But cognitive problems are not the only harmful aspect of these drugs. Other serious adverse reactions include:
Personality changes / mood disorders
Muscle problems, polyneuropathy (nerve damage in the hands and feet), and rhabdomyolysis (a serious degenerative muscle tissue condition)
Pancreas or liver dysfunction, including a potential increase in liver enzymes
According to Dr. Graveline, a form of Lou Gehrig's disease or ALS may also be a side effect, although the US FDA is reluctant to accept the link found by its Swedish counterpart, and has so far refused to issue a warning.
"The World Health Organization (WHO) reported on this in July 2007 when Ralph Edwards, who directs the Vigibase in Sweden (the equivalent of the US MedWatch), reported ALS-like conditions in statin users worldwide," Dr. Graveline says.
He has since forwarded hundreds of cases to MedWatch, but the FDA still has not been moved to act, and doctors are therefore unaware of the connection between this deadly disease and statin use.
"[W]e have anecdotal evidence that if you stop the statin drug early enough, some of these cases regress. That's why we thought it was important that FDA issue a warning, but they haven't," Dr. Graveline says.
Today, all of these adverse effects, including the cognitive problems Dr. Graveline warned about 10 years ago, are supported by published research. MedWatch has received about 80,000 reports of adverse events related to statin drugs, and remember, only an estimated one to 10 percent of side effects are ever reported, so the true scope of statins' adverse effects is still greatly underestimated.
14 February, 2011
Another backflip: Forget those high cholesterol warnings, eggs are healthier than ever, say experts
If you're eyeing up your breakfast options and fancy going to work on an egg, there’s no need to hold back. For after years of telling us to shun them as an everyday food, the health police now say that eggs have become better for us. The cholesterol content of eggs – previously believed to be a health risk – is now much lower than it was ten years ago, a study suggests.
Eggs also contain more vitamin D, which helps protect the bones, preventing diseases such as osteoporosis and rickets.
The reason eggs have become more nutritious over the past decade is that hens are no longer fed bone meal, which was banned in the Nineties following the BSE crisis, the researchers claim. Instead the birds are normally given a mixture of wheat, corn and high-protein formulated feed, which makes their eggs more wholesome.
A U.S. government study found that modern eggs contain 13 per cent less cholesterol and 64 per cent more vitamin D compared with a decade ago. This is backed by British research which shows that a medium-sized egg contains about 100mg of cholesterol, a third of the 300mg recommended daily limit.
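The figures quoted above hang together arithmetically; a quick illustrative check (the decade-old cholesterol value is my own back-calculation, not a number from the study):

```python
# Sanity-checking the egg figures quoted above (illustrative only).

modern_chol_mg = 100   # mg cholesterol in a medium egg (British research)
daily_limit_mg = 300   # mg recommended daily limit

# A medium egg uses about a third of the daily limit:
share = modern_chol_mg / daily_limit_mg
print(f"Share of daily limit: {share:.0%}")   # 33%

# If modern eggs have 13 per cent less cholesterol, the decade-old
# figure was roughly modern / (1 - 0.13):
old_chol_mg = modern_chol_mg / (1 - 0.13)
print(f"Implied old figure: {old_chol_mg:.0f} mg")  # ~115 mg
```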
Andrew Joret, deputy chairman of the British Egg Industry Council, whose firm Noble Foods made the findings, said: ‘We believe the reduction is due to changes in the feeds used in British plants since the Nineties when the use of bone meal was banned.’
Two years ago Canadian researchers claimed that eggs actually helped lower blood pressure. They suggested that when eggs are digested they produce proteins that mimic the action of powerful blood pressure-lowering drugs, known as Ace inhibitors.
A recent Surrey University study found eating one or two eggs for breakfast could help with weight loss as the high protein content makes us feel fuller longer. The study, which involved volunteers eating two eggs a day for 12 weeks, also found that none had raised cholesterol.
In the Sixties many Britons ate up to five eggs a day but by the Nineties this had dropped to two or three a week – in part due to warnings about high cholesterol levels.
Charles Saatchi, husband of TV chef Nigella Lawson, recently claimed to have lost five stone by eating eggs for breakfast, lunch and dinner.
Another step along the road towards medicalizing all problems: Grief now a mental illness
By that standard, the much-respected Queen Victoria was as nutty as a fruitcake
A PUSH to classify grief as a psychological disorder has been criticised by experts as "disease-mongering". Prolonged grief disorder (PGD) will be included in the next edition of the Diagnostic and Statistical Manual of Mental Disorders, the psychiatrists' bible used to diagnose mental problems.
Bereavement, a normal part of life, has been excluded from previous editions of the manual, but Australian Medical Association spokesman and psychiatrist Tom Stanley says there is significant evidence of PGD. "Occasionally, normal grief will become pathological and, in many cases, it will precipitate severe depression," Dr Stanley said.
Prolonged grief disorder was originally identified by US psychiatrists, but counsellor Mal McKissock, of Sydney's Bereavement Centre, said: "It's nothing more than disease-mongering. My colleagues in the US get reimbursed only if there's a real sickness, so they created one."
Mr McKissock is concerned about the growing number of patients he sees who have been medicated with anti-depressants. "I'd venture that 50 per cent of the people I see are on anti-depressants - and that includes children, which is outrageous," he said.
"People think they have a disease. They think they're depressed, but they are sad, passionately sad, and it's a natural process."
Former Australian of the Year and campaigner for mental health Professor Pat McGorry said there was a distinction between normal grief and prolonged grief that could lead to severe depression.
13 February, 2011
Another labelling fail
Customers overlooking unit pricing
Shoppers are spending more at the checkouts because they are not using unit prices to work out the cheapest brands. Research shows that fewer than 50 per cent of shoppers overseas look at the unit price when buying groceries, and experts say the take-up rate in Australia is just as low. Unit pricing was introduced here just over a year ago.
[I'd like to see the research behind that 50%. I see lots of people in my local supermarket loading up their carts but it's a rarity to see them stop and read a label. What people say they do and what they actually do can be two different things.]
Gary Mortimer, from QUT's School of Advertising, Marketing and Public Relations, said that with so much information on labels, like price, barcode, size and description, shoppers could be overwhelmed, or simply not have the time to do the maths.
Ian Jarrett, of the Queensland Consumers Association, which has been lobbying for unit prices to be printed in the same type size recommended for packaging labels, said there were big savings to be made.
His survey of prices at one supermarket came up with a trolley full of savings. For example, sultanas in six small packs cost $10.38 a kilo but only $3.79 a kilo if bought in a 1kg bag. A tub of Meadowlea margarine cost 50c for each 100g if bought in a 1kg tub, 70c for each 100g if bought in a 500g tub or 78c in a 250g tub.
But Dr Mortimer said there was a need for caution. "A larger packet may be cheaper per gram than a smaller packet but if you have to pay a higher retail price to start with, it becomes a false economy if you end up wasting half the contents of the larger packet because you simply cannot use it all."
He said it was also important to remember unit pricing did not take into consideration such things as different plies of toilet paper and different concentrations of products such as washing detergents and cordial. "Unit pricing does not capture different densities, concentrations and strengths and these need to be taken into consideration," he said.
Dr Mortimer said different demographics were attracted to unit pricing. "There are some shoppers who think 'I am pretty well paid, I don't need to save money'," he said. "And there are some Gen-Y shoppers, and particularly male shoppers, who think 'I want to get in and get out. I don't have time to look at the different numbers on the ticket'.
"But if you are a mum with three kids from a working-class environment and need to buy 800g or a kilo for lunches for the kids and dad this week and you can wait to be served, you can make a saving."
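The savings Mr Jarrett describes fall straight out of unit-price arithmetic; a minimal sketch using the article's figures (the total tub prices are implied by the quoted per-100g costs, not stated in the article):

```python
# Unit-price comparison using the figures from Mr Jarrett's survey.

def per_100g(total_price, grams):
    """Cost per 100g for a pack of the given total price and weight."""
    return total_price / grams * 100

# Margarine: total tub prices implied by the quoted per-100g costs.
tubs = {"1kg tub": (5.00, 1000), "500g tub": (3.50, 500), "250g tub": (1.95, 250)}
for name, (price, grams) in tubs.items():
    # per_100g returns dollars; multiply by 100 for cents.
    print(f"{name}: {per_100g(price, grams) * 100:.0f}c per 100g")

# Sultanas: $10.38/kg in six small packs vs $3.79/kg in a 1kg bag.
saving_per_kg = 10.38 - 3.79
print(f"Sultana saving: ${saving_per_kg:.2f} per kilo")
```

Running this reproduces the 50c/70c/78c ladder from the article, which is exactly the comparison a shopper would otherwise have to do in their head at the shelf.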
Speculate, Speculate, Speculate
Medical researchers seem to do little else. The claim below that sunshine lowers MS risk is totally naive. What their results more likely show is that people in poor health don't go out much
The Australian love affair with the great outdoors may have contributed to lower rates of multiple sclerosis (MS), according to research from The Australian National University.
The Ausimmune Study, coordinated by Associate Professor Robyn Lucas from the ANU College of Medicine, Biology and Environment and involving researchers from across Australia, found that people who spend more time in the sun, and those with higher vitamin D levels, may be less likely to develop MS.
MS is a chronic disease of the brain and spinal cord and has long baffled researchers, who continue to search for its cause and cure. This study, published in the February 8, 2011, issue of Neurology, the medical journal of the American Academy of Neurology, takes us one step closer to understanding the risk factors that may lead to MS.
Associate Professor Lucas said that many people who experience preliminary symptoms of the sort that occur in MS – known as a ‘first event’ – go on to develop the disease. The Ausimmune Study found that the risk of having a first event was lower in people with higher sun exposure – over the whole of their lives as well as in the months preceding the event, compared with unaffected people of the same age and sex and living in the same region of Australia.
“People with the highest levels of vitamin D were also less likely to have a diagnosed first event than people with the lowest levels,” she said.
The study is the first to look at sun exposure and vitamin D status in people who had experienced a first event with the type of symptoms found in MS.
“Previous studies have looked at people who already have MS,” said Dr Lucas. “This has made it difficult to know whether having the disease led them to change their habits in the sun or their diet. That is, it has not been possible to work out if low sun exposure or vitamin D cause the disease or were caused by having the disease.”
Associate Professor Lucas said that the study showed, for the first time in a human population, that the effects of sun exposure and vitamin D act independently of each other, with each having a beneficial effect in decreasing the risk of a first event.
“Further research should evaluate both sun exposure and vitamin D for the prevention of MS,” she said.
12 February, 2011
A Toxic Bureaucracy
Sam Kazman's "Drug Approvals and Deadly Delays" article in the Journal of American Physicians and Surgeons (Winter 2010), tells a story about how the U.S. Food and Drug Administration's policies have led to the deaths of tens of thousands of Americans. Let's look at how it happens.
During the FDA's drug approval process, it confronts the possibility of two errors. If the FDA approves a drug that turns out to have unanticipated, dangerous side effects, people will suffer. Similarly, if the FDA denies or delays the marketing of a perfectly safe and beneficial drug, people will also suffer. Both errors cause medical harm.
Kazman argues that from a political point of view, there's a huge difference between the errors. People who are injured by incorrectly approved drugs will know that they are victims of FDA mistakes. Their suffering makes headlines. FDA officials face unfavorable publicity and perhaps congressional hearings.
It's an entirely different story for victims of incorrect FDA drug delays or denials. These victims are people who are prevented access to drugs that could have helped them. Their suffering or death is seen as reflecting the state of medicine rather than the status of an FDA drug application. Their doctor simply tells them there's nothing more that can be done to help them.
Beta-blockers reduce the risks of secondary heart attacks and were widely used in Europe during the mid-'70s. The FDA imposed a moratorium on beta-blocker approvals in the U.S. because of the drug's carcinogenicity in animals. Finally, in 1981, FDA approved the first such drug, boasting that it might save up to 17,000 lives per year. That meant as many as 100,000 people might have died from secondary heart attacks waiting for FDA approval.
In the early 1990s, it took the FDA more than three years to approve interleukin-2 as the first therapy for advanced kidney cancer. By the time the FDA approved the drug, it was available in nine European countries. The FDA was worried about the drug's toxicity, which resulted in the death of 5 percent of those who took it during testing trials. This concern obscures the fact that metastatic kidney cancer kills 100 percent of its victims.
Kazman says that if we estimate that interleukin-2 would have helped 10 percent of those who would otherwise die of kidney cancer, then the FDA's delay might have contributed to the premature deaths of 3,000 people. Kazman asks whether we've seen any photos or news stories of the 3,000 victims of the FDA's interleukin-2 delay or the 100,000 victims of the FDA's beta-blocker delay.
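Kazman's two estimates are simple multiplications, and they can be reconstructed from the figures in the article. A minimal sketch follows; note that the annual kidney-cancer death figure is inferred so that the arithmetic yields Kazman's 3,000, and is not stated in the article itself:

```python
# Back-of-the-envelope reconstruction of Kazman's delay-cost estimates.
# Inputs come from the article; the annual kidney-cancer death figure
# is an inference (it makes the arithmetic come out to 3,000), not a
# number the article states.

def delayed_deaths(lives_saved_per_year, delay_years):
    """Lives lost while an effective drug awaits approval."""
    return lives_saved_per_year * delay_years

# Beta-blockers: the FDA's own 1981 claim of up to 17,000 lives/year,
# with the moratorium lasting roughly six years (mid-'70s to 1981).
beta_blocker_toll = delayed_deaths(17_000, 6)
print(beta_blocker_toll)  # prints 102000 -- the article rounds to 100,000

# Interleukin-2: a three-year delay, with Kazman's assumption that the
# drug would have helped 10% of those otherwise dying of kidney cancer.
annual_deaths = 10_000  # inferred, see note above
il2_toll = delayed_deaths(0.10 * annual_deaths, 3)
print(il2_toll)  # prints 3000.0
```

The point of laying it out this way is that both figures are sensitive to the assumed delay length and efficacy rate; they are order-of-magnitude estimates, not body counts.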
These are the invisible victims of FDA policy. In the 1974 words of FDA commissioner Alexander M. Schmidt: "In all of FDA's history, I am unable to find a single instance where a congressional committee investigated the failure of FDA to approve a new drug. But, the times when hearings have been held to criticize our approval of new drugs have been so frequent that we aren't able to count them. ... The message to FDA staff could not be clearer."
That message is to always err on the side of overcaution where FDA's victims are invisible and the agency is held blameless.
Kazman's day job is general counsel for the Washington, D.C.-based Competitive Enterprise Institute, which has surveyed physicians about their views of the FDA. On approval speed, 61 to 77 percent of physicians surveyed say the FDA approval process is too slow. Seventy-eight percent believe the FDA has hurt their ability to give patients the best care.
But so what? Physicians carry far less weight with the FDA than "public interest" advocates and politicians.
When the FDA announces its approval of a new drug or device, the question that needs to be asked is: If this drug will start saving lives tomorrow, how many people died yesterday waiting for the FDA to act?
Breakthrough treatment for AMD blindness trialled
A new treatment for the most common form of blindness in Britain has been developed by scientists. Researchers discovered that sufferers of AMD (dry age-related macular degeneration) lack a protective enzyme in the eye, which may explain why the condition leads to a loss of sight. Drugs that boost this enzyme are to be tested on patients within a year, it was reported in the journal Nature.
More than 500,000 people in Britain suffer from AMD which is caused by the deterioration and death of the cells in the macula, a part of the retina used to see straight ahead. The disease robs sufferers of their sight by creating a black spot in the centre of their vision which slowly gets bigger.
With numbers of AMD sufferers expected to treble in the next 25 years as the population ages, there is an urgent need for a breakthrough.
Scientists at the University of Kentucky found the activity of the enzyme called DICER1 was dramatically reduced in the retina of human donor eyes with the condition. Laboratory experiments on mice also showed this reduced expression was associated with a build-up of a poisonous molecule called Alu RNA that kills cells in the retina.
Dr Jayakrishna Ambati, of the University of Kentucky, and colleagues found DICER1 can shield the retina against the destructive effects of Alu RNA. "When the levels of Dicer decline, the control system is short-circuited and too much Alu RNA accumulates. "This leads to death of the retina."
In developed countries it is estimated that one in fifty people over fifty years of age, and up to one in five people over the age of 85, have AMD.
Currently, there is no reliably proven medical treatment for dry AMD. But not smoking and eating a healthy diet may help to slow the rate of deterioration.
Dr Ambati has created two treatments that could potentially halt the march of the disease. One works by boosting levels of Dicer, the other breaks down the toxic Alu RNA. The University of Kentucky has applied to patent the techniques and the first trials on people could start by the end of this year.
11 February, 2011
Restaurant nutrition under attack from Mrs. Obama
What makes Mrs Obama an authority on nutrition? Does going to bed with the President give an automatic transfusion of wisdom?
After wrapping her arms around the retail giant Wal-Mart and trying to cajole food makers into producing nutrition labels that are easier to understand, Michelle Obama, the first lady and a healthy-eating advocate, has her sights set on a new target: the nation’s restaurants.
A team of advisers to Mrs. Obama has been holding private talks over the past year with the National Restaurant Association, a trade group, in a bid to get restaurants to adopt her goals of smaller portions and children’s meals that include healthy offerings like carrots, apple slices and milk instead of French fries and soda, according to White House and industry officials.
The discussions are preliminary, and participants say they are nowhere near an agreement like the one Mrs. Obama announced recently with Wal-Mart to lower prices on fruits and vegetables and to reduce the amount of fat, sugar and salt in its foods. But they reveal how assertively she is working to prod the industry to sign on to her agenda.
On Tuesday, Mrs. Obama will begin a three-day publicity blitz to spotlight “Let’s Move!,” her campaign to reduce childhood obesity, which was announced one year ago this week.
She will introduce a public service announcement, appear on the “Today” show and deliver a speech in Atlanta promoting gardening and healthy-eating programs.
But as she uses her public platform to persuade children to eat healthier and exercise more, Mrs. Obama and her team are also quietly pressing the levers of industry and government. Over the past year she has become involved in many aspects of the nation’s dietary habits, exerting her influence over nutrition policy.
Her team has worked with beverage makers to design soda cans with calorie counts and is deeply involved in a major remake of the government’s most recognizable tool for delivering its healthy-eating message: the food pyramid.
Mrs. Obama persuaded Congress to require schools to include more fruits and vegetables in the lunches they offer, and she encouraged lawmakers to require restaurants to print nutrition information on menus, a provision that wound up in President Obama’s landmark health care law.
“They really want a cooperative relationship with the food industry, and they’re looking at industry to come up with ideas,” said Lanette R. Kovachi, corporate dietitian for Subway, the nation’s second-largest restaurant chain in terms of revenue. She said she had taken part in at least four conference calls with Mrs. Obama’s food advisers.
But in seeking partnerships with industry, Mrs. Obama runs a risk. While nutritionists and public health advocates give her high marks for putting healthy eating on the national agenda, many worry that she will be co-opted by companies rushing to embrace her without offering meaningful change.
“Can the food industry play a responsible role in the obesity epidemic? The answer isn’t no,” said Dr. David Ludwig, the director of the Optimal Weight for Life program at Children’s Hospital in Boston. “The point is that the best initiatives can be subverted for special interest, and it’s important to be vigilant when we form partnerships with industry.”
White House officials say Mrs. Obama has believed from the start that bringing industry to the negotiating table is critical to achieving her long-range goal of eliminating childhood obesity within a generation.
Why it's known as reefer madness
I haven't had time enough to track down the research reports underlying the summary below but I note an apparent admission (towards the end of the article) that pot use is highest at the bottom end of the social class scale. That leaves open the possibility that all the associations with pot use are in fact associations with social class only
CANNABIS is widely used for its perceived positive effects, yet there is a growing scientific consensus that a range of health and social harms are associated with its use, including educational underachievement, higher school dropout rates, impaired driving ability, the abuse of other illicit drugs and the early onset of some mental illnesses. Teenagers are particularly vulnerable.
Now an international study, released this morning, has given us some of the strongest evidence to date of the risks of cannabis on long-term mental health.
Cannabis polarises public opinion like few other substances; some see it as a benign, chilled out alternative to alcohol while others fear it as an illegal gateway drug to heroin addiction.
It is by far the most commonly used illicit drug, with one third of the population reporting that they used cannabis at some time during their lives, including 14 per cent of 12- to 17-year-olds. Almost 10 per cent of Australians used the drug in the past year. But despite varying policies of decriminalisation in many states and territories, it is still illegal to use, possess, grow or sell cannabis in Australia.
No approach to preventing cannabis harm will ever be without controversy, but the best place to start is with public awareness of the facts; those facts drawn from thorough, evidence-based research.
The research, by Matthew Large and colleagues, used improved statistical techniques to find that cannabis use hastens the onset of psychosis among young people by up to 2.7 years, often bringing mental illness forward to coincide with the critical years of adolescent brain development. Interestingly, alcohol use did not have any effect on age of onset.
The study was also able to show that known influences such as gender or age differences between the samples of cannabis and non-cannabis users with psychosis were not responsible for this difference.
The importance of adolescence and early adulthood for brain development has only recently been recognised. This period is characterised by rapid changes in brain growth and connectivity, especially in those parts of the brain responsible for cognitive and social processes. There is evidence that cannabis can disrupt this brain development, which may explain the mechanism responsible for the earlier age at onset of psychosis among adolescent cannabis users. Studies of a particular gene and environmental interaction, that has a critical window in adolescence, are also helping to explain the mechanism behind this finding.
A number of studies have now followed individuals from birth or very early adolescence into adulthood. Any cannabis use increases the risk of experiencing psychosis by about 40 per cent and regular use doubles this risk. Using cannabis in early adolescence increases the risk even further. The earlier onset of psychosis is also associated with a poorer lifelong prognosis, as is continued cannabis use.
A lesser recognised consequence is the effect of cannabis on educational attainment. A recent study combining three large Australian and New Zealand studies found high school dropout rates would fall by 17 per cent with no cannabis use. Possible reasons for this robust association may be that the early use of cannabis sets in train biological, individual or social processes which undermine motivation, learning or commitment to school, independent of other influences on attitudes to education.
These findings combine to produce further evidence that avoiding cannabis use, particularly in adolescence, can both significantly delay the onset of psychosis and improve educational outcomes.
Public awareness of the significant risks associated with adolescent cannabis use is critical in reducing the burdens of psychotic disorders and underachievement, and their costs to individuals, families and communities.
Happily, Australia has experienced a decline in overall cannabis use since a peak in 1998. The rates of regular and heavy cannabis use among marginalised members of the community, however, are unchanged or increasing.
This means public awareness campaigns and support services should urgently prioritise young people experiencing socioeconomic disadvantage or within the criminal justice system, Aboriginal and Torres Strait Islanders, and people with a family history of mental illness; groups that are already vulnerable.
To even begin to answer the most fundamental questions about how best to communicate this information and, more importantly, how to influence behaviour, the real risks of use need to be widely understood. This latest research sends us a clear message about avoiding use, particularly for teens.
Like alcohol, which is also widely used at considerable personal and public cost, cannabis, too, is an urgent, public health priority.
10 February, 2011
Smoking harms mental health but quitters arrest decline, study finds
This study seems methodologically sound
SMOKING accelerates mental decline and damages parts of the brain linked to dementia, an Australian study has found. But there is good news for long-term smokers: quitting reverses the harmful effects on the brain.
The study assessed brain function using standard performance tests, matching the results to brain scans in 229 elderly smokers who were trying to give up and 98 non-smokers.
The research, repeated at six-monthly intervals for two years, was the first in the world to track changes in smokers' mental performance over a lengthy period.
It found the smokers, who were aged 68 and over, lost a disproportionate number of brain cells in regions important for memory and active thinking. "For the first time it shows what we see with our memory tests is confirmed by changes in the brain," said Osvaldo Almeida, professor of geriatric psychiatry at the University of Western Australia.
The smokers who failed to quit slid into mental decline twice as fast as non-smokers, but "those who quit, don't decline faster than those who never smoked", said Professor Almeida, a consultant at the Royal Perth Hospital where the patients were recruited.
"It's a good thing for your brain to quit," he said. "People who stop smoking, in terms of memory and cognitive function, do as well as people who never smoked."
While the quitters did show some grey matter loss 18 months after giving up, Professor Almeida said the damage was not significant and not in brain regions significant to cognitive impairment.
"The study suggests even later in life, if people stop smoking, it will have a positive effect on their brains," he said. The results, published in the journal NeuroImage, meant action could be taken to reduce the burden of Alzheimer's disease without waiting for a "magic pill".
"If people understand that smoking is an important risk factor … that contributes to progressive intellectual decline, this would hopefully reduce the burden of dementia in Australia."
Professor Almeida plans to continue testing participants at five-year intervals.
Kaarin Anstey, director of the dementia collaborative research centre at the Australian National University, said: "Before, people might have thought, 'it's too late for me … there's no point in me giving up because the damage is already done'. There's a very positive message in this research showing it's never too late and these quitters are getting a benefit, even in their 70s."
Australia: Useless health regulators
If people knew they were on their own they might be more cautious -- but they have the illusion of being protected by government regulation
The Therapeutic Goods Administration is calling for submissions as to how it can improve its transparency. Which is not before time, given that a recent spot check of 400 so-called alternative medicine products found that nine out of 10 breached TGA regulations.
I decided to put my own two bobs' worth in. In the interests of transparency, I reproduce an edited version of my own submission to the review here. It's not as good as the more detailed submissions from Choice magazine and the Consumers Health Forum of Australia, but I found writing it to be, well, therapeutic. I wrote:

At the moment the TGA appears to operate as a rubber stamp for health fraud. It beggars belief that the "Aust L" "self-assessment" system allows people to sell untested products for medicinal use without having to provide so much as a jot of evidence for their efficacy.
The fact that a recent spot check found that 90 per cent of so-called complementary and alternative medicine products do not comply with regulations shows that the system is broken.
To help with transparency:
* All "Aust L" listed products should carry a warning box stating something to the effect of "The makers of this product have shown zero evidence that this product works for any disease or condition or that it can improve your general health or wellbeing."
* Products that are known to be scientifically implausible should be labelled as such. The labelling should explain that scientific implausibility doesn't mean that "it works but we don't know how"; that it means "it can't possibly work as claimed unless much of our accumulated knowledge of physics, chemistry and physiology is completely wrong".
* The complaints process should not be so completely opaque. Some weeks ago I used the TGA's online complaints submission to submit a complaint (about the sale of colloidal silver for human consumption and the treatment of serious diseases in what seems a clear breach of the Therapeutic Goods Act). When I called to check up on the complaint a week later I was told it had never arrived so I had to re-submit it by email. Complaints submitted online should get a receipt number, like you do when you pay a bill online.
* People who make complaints should be informed of the result of the investigation - if indeed any investigation takes place. I have no idea whether my complaint will be investigated, and I get the impression that I will never find out unless I keep hassling people to check up on it.
* The TGA should be issuing a hell of a lot more advisories. The list of advisories is very short and threadbare. For instance, there is only one mention of homeopathic "immunisations" - a warning from 2002 not to use homeopathic "vaccines" for meningococcal disease. There is no warning about homeopaths selling useless "immunisations" for other diseases (including whooping cough, a serious respiratory infection that has been killing Australian children again in recent years). Also, there are no warnings about colloidal silver, ear candling and countless other things on which Australians are wasting their money and risking their health every day.
* There should be real consequences for breaching the Act. As long as the manufacturers and retailers of alternative nostrums feel free to scoff at the TGA they will continue to put people's health at risk.
What do you think? Do you think the TGA should do more to tell consumers that there is no evidence to support many of the remedies being sold in pharmacies, supermarkets and health-food stores? Do you think people who want to sell alternative remedies should have to prove that they work before they can be approved for sale? Submissions to the TGA review close tomorrow, by the way.
9 February, 2011
Study links junk food to lower IQ
Predictable rubbish. Without control for parental IQ it is meaningless. High IQ people might well have different diets. The journal article is: "Are dietary patterns in childhood associated with IQ at 8 years of age? A population-based cohort study". See also here
TODDLERS who have a diet high in processed foods may have a slightly lower IQ in later life, according to a British study described as the biggest research of its kind.
The conclusion comes from a long-term investigation into 14,000 people born in western England in 1991 and 1992 whose health and wellbeing were monitored at the ages of three, four, seven and eight and a half.
Parents of the children were asked to fill out questionnaires that, among other things, detailed the kind of food and drink their children consumed.
Three dietary patterns emerged: one was high in processed fats and sugar; then there was a "traditional" diet high in meat and vegetables; and finally a "health-conscious" diet with lots of salad, fruit and vegetables, pasta and rice.
When the children were eight and a half, their IQ was measured using a standard tool called the Wechsler Intelligence Scale.
Of the 4000 children for whom there were complete data, there was a significant difference in IQ between those who had had the "processed" diet and those who had had the "health-conscious" diet in early childhood.
The 20 per cent of children who ate the most processed food had an average IQ of 101 points, compared with 106 for the 20 per cent of children who ate the most "health-conscious" food.
"It's a very small difference, it's not a vast difference," said one of the authors, Pauline Emmett of the School of Social and Community Medicine at the University of Bristol. "But it does make them less able to cope with education, less able to cope with some of the things in life."
The association between IQ and nutrition is a strongly debated issue because it can be skewed by many factors, including economic and social background. A middle-class family, for instance, may arguably be more keen (or more financially able) to put a healthier meal on the table, or be pushier about stimulating their child, compared to a poorer household.
Dr Emmett said the team took special care to filter out such confounders. "We have controlled for maternal education, for maternal social class, age, whether they live in council housing, life events, anything going wrong, the home environment, with books and use of television and things like that," she said.
The size of the study, too, was unprecedented. "It's a huge sample, it's much much bigger than anything anyone else has done," she said in an interview with AFP.
Dr Emmett said further work was needed to see whether this apparent impact on IQ persisted as the children got older.
Asked why junk food had such an effect, she suggested a diet that was preponderantly processed could lack vital vitamins and elements for cerebral development at a key stage in early childhood. "A junk food diet is not conducive to good brain development," she said.
The paper appears in the peer-reviewed Journal of Epidemiology and Community Health, published by the British Medical Association.
Testosterone in womb linked to autism risk
This is pure speculation. They had NO DATA on testosterone!
PERTH researchers have uncovered further evidence of a link between testosterone and autism, backing a theory that high testosterone exposure in the womb increases the risk of the disorder.
Researchers at Fiona Stanley's Telethon Institute for Child Health Research found that girls with autistic-like behaviours at age two had their first period about six months later than girls without the disorder's symptoms, The Australian reports.
"These findings indicate that exposure to testosterone in the womb may be regulating both autism-like behaviours and the age of first period and that this may play a role in clinical autism," lead researcher Andrew Whitehouse said.
He said the findings were linked to the so-called "male brain theory" of autism, which suggests the behaviour disorder is an extreme form of male mental traits.
"Autism is a real male-dominated condition; it affects around four males to one female, but there are also characteristics of people with autism that are more male-like," he said. "People have started thinking what might cause that, and the obvious candidate is male-type hormones and the most biologically active is testosterone."
The study looked at 383 girls who had no diagnosis of autism. At age two they were each given a rating for showing autistic-like behaviours such as avoiding looking people in the eye.
8 February, 2011
Even "junk" food can provide your daily dose of vitamins
The mountain of food surrounding me would send shivers up the spine of any right-thinking foodie. On my kitchen table sits an awesome pile of junk: frozen pizzas, long-life naan breads, industrial cheese, instant mashed potato, chocolate, peanuts and burgers, all of it amounting to four times my usual daily calories.
A diet made up solely of this stuff could, in the long-term, endanger your health by sheer volume of fat, salt and sugar. But it also has something very important going for it: this mountain of junk contains your entire Recommended Daily Allowance (RDA) of the 18 vitamins and minerals your body needs.
We’ve all seen RDAs on cereal packets and vitamin-supplement bottles, listing substances such as iron, niacin and pantothenic acid. If you’re anything like me, you’ve felt a warm glow of smugness when you’ve read on the side of your Cheerios that ‘one 30g serving provides 25 per cent of your RDA of riboflavin’. How lovely, we think — until it dawns on us that we have no idea a) what riboflavin is, b) what it’s for and c) what other food we’d find it in.
So, many of us throw money at the problem in the form of expensive multivitamin pills in a desperate attempt to hit our RDAs, which is why the supplement industry in the UK is worth £396 million a year.
But do we really need RDAs? I set out to investigate what they mean, and whether it’s really possible to hit these nutritional targets with daily diet.
It turns out that RDAs are of questionable use, and have largely ended up as a way for food manufacturers to boost sales. And incredibly, you can get all your RDAs from junk food — as illustrated by the U.S. study, out this week, that found dark chocolate contained more antioxidants and healthy plant compounds than fruit juice.
RDAs were first cooked up in the darkest days of World War II by U.S. doctors who wanted to know the minimum rations their servicemen could live on while based in England, preparing for D-Day. Food was shipped over in convoys braving the U-boat infested Atlantic, so it was imperative sailors’ lives were not risked unnecessarily.
At the end of the Forties, British doctors went a step further, concocting an average required intake of vitamins to maintain daily health based on post-war rationing.
Since then, the guidelines have been reviewed several times. They were formalised by the European Commission in 1991, but in the same year the Department of Health replaced them with a much more comprehensive but slightly baffling system that uses DRVs (Dietary Reference Values), which give recommendations based on age, sex and health.
But it’s RDAs that are still used on the side of your cereal packet — even though all those percentages and milligrams are outdated and, for many people, completely misleading. For instance, a healthy male adult needs eight times more iron than a three-month-old baby (and an iron overdose can kill).
I asked Nestle, who make my Cheerios, why they still include RDAs on their packaging, but they failed to respond before we went to press. But, I hear you cry, RDAs are on food labels because they're legally required, so surely they must be right?
When I contacted the Food and Drink Federation about this, they said it’s only if food manufacturers state the amount of vitamins and minerals on the packaging that they’re legally required to provide them as a percentage of the RDAs. RDAs are actually compulsory only for a limited range of products such as supplements and infant formula, says Peter Berry Ottaway, a leading British consultant in food sciences for more than 30 years. ‘Major supermarket groups have been forcing suppliers to put labelling on everything, as they believe that it improves sales,’ he adds.
So how much notice should we take of RDAs?
After I had chased various agencies, the Department of Health finally responded to my questions. Basically, RDAs do exist, but officially we don't really use them any more.
It’s all very confusing — there’s no easy-to-access information that the public can find. Official dietary advice tells us that we should be able to get all our vitamins and minerals from a healthy diet and shouldn’t need supplements.
I wanted to find out if it’s really possible to eat my full wallop of RDAs — not just from the oyster and guava fantasy diets dreamt up by advertising executives, but from the food real people like you and me eat when we’re busy, tired at the end of a long day, or desperate for a snack. Which is why I find myself heading for Morrisons in North London to buy my RDA.
One of the first things I discover is that many of these vitamins and minerals can be pretty tricky to get from a normal, healthy diet. For example, I’d have to eat 14 large portions of peas to get my daily intake of magnesium — or 300g of dark chocolate. Hmm, I know which one I think I can manage.
You can get your full dose of vitamin C from two-and-a-half portions of chips — or one-and-a-quarter bags of watercress. In fact, the junk food often has a gratifying amount of good stuff.
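The whole exercise amounts to a per-nutrient sum: for each vitamin or mineral, add up the percentage of the RDA each food supplies and check the total reaches 100 per cent. The sketch below shows the bookkeeping; the foods are from the article, but every percentage figure is an invented placeholder, not real nutrition data:

```python
# Illustrative tally of RDA percentages across a day's foods.
# Foods are drawn from the article; ALL percentage values are
# made-up placeholders for the sake of the example.

menu = {
    "Guinness (1 pint)":       {"vitamin B12": 100},
    "breaded haddock (2)":     {"phosphorus": 100, "vitamin B12": 10},
    "dark chocolate (300g)":   {"magnesium": 100, "iron": 40},
    "Burger King Whopper (1)": {"iron": 60, "zinc": 30},
}

def tally(menu):
    """Sum each nutrient's RDA percentage across all foods eaten."""
    totals = {}
    for nutrients in menu.values():
        for nutrient, pct in nutrients.items():
            totals[nutrient] = totals.get(nutrient, 0) + pct
    return totals

totals = tally(menu)
for nutrient, pct in sorted(totals.items()):
    status = "met" if pct >= 100 else "short"
    print(f"{nutrient}: {pct}% of RDA ({status})")
```

With real figures from food labels plugged in, the same loop would tell you which nutrients a day's eating actually covers and which fall short.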
Once home, I begin my challenge with lunch, cracking open my pint of Guinness (for my full RDA of vitamin B12).
Next, I start work on two portions of breaded haddock (for my phosphorus RDA) and two-thirds of a pizza (the dough gives me my full dose of calcium). I also have my first of the day’s Burger King Whoppers (iron) and a large portion of spinach (folic acid).
Still feeling full, I sit down to tea. Two-and-a-half slices of Victoria sponge (vitamin D), a pint of milk (riboflavin) and a thick sarnie of salmon paste (iodine) later, I’m feeling listless and slovenly.
At supper, I plod through four naan breads (vitamin E) and four scoops of instant mash (fortified with vitamin C) as my two young daughters watch in awe. This turns to irritation as I refuse to let them help me with three bars of dark chocolate (magnesium).
I put three tablespoons of cream cheese (vitamin A) into the remaining one-and-a-half Whoppers. A slice of liver gives me a tasty whack of zinc, and I wash it all down with a nice cup of Bovril (thiamin). I snack on six bananas for my vitamin B6 (one vitamin it’s hard to find in junk food) as I watch a movie, and by 11pm I think I’ve made it — an hour early.
But then I spot a large pack of roasted peanuts I’d forgotten. I’m tempted to skip them until I realise I have to eat them for the biotin, pantothenic acid and niacin. A shade before midnight, I raise my arms in a silent cheer: I’ve done it!
The next morning, as I sat down to write up the whole sorry episode, I felt pain, shame, lethargy and a fair amount of flatulence. But on the positive side, I have been enlightened. The extraordinary thing about my day’s diet was not just that I had managed to eat my entire RDA of all 18 vitamins and minerals, but that those nutrients came from unlikely sources.
Nutritionists may dismiss convenience food and naughty snacks as rubbish food with ‘empty calories’, but that’s often not the case. As it turns out, loads of foods have lurking goodness, even though they also have high concentrations of fat, salt or sugar.
So should we ignore RDAs? Yes. If looking at the side of your cereal packet makes you feel happy, that’s great, but, weirdly, a good chunk of our minerals and vitamins can be in ‘junk food’ — so a bag of chips can be part of a well-balanced diet.
As for supplements, it may be useful to take notice of RDAs if you’re pregnant, sick or elderly, but unless your doctor says you need them, don’t throw your money away. Spend it on better food, instead.
Now, I don’t want people to eat more burgers and chips than they already do, considering obesity levels. But I’m also wary of the nutritional Nazis who try to spread fear and guilt, while flogging expensive nuts, snake oil and supplements.
A good diet is one that covers a broad range of food groups, doesn’t make you overweight, and makes you happy. So eat well and broadly and certainly don’t take too much notice of RDAs — they’re all going to change again in November.
Homosexuals to be immunized against anal cancer
The immunisation program that protects girls against the virus linked to cervical cancer should immediately be extended to boys to prevent other cancers, a leading epidemiologist says.
Vaccinating boys against the human papillomavirus (HPV) would help stem a drastic rise in some cancers, particularly among homosexual men, said Andrew Grulich, the head of the epidemiology and prevention program at the National Centre in HIV Epidemiology and Clinical Research at the University of NSW.
The Pharmaceutical Benefits Advisory Committee will consider next month an application to provide the Gardasil vaccine free to boys.
About 90 per cent of all anal squamous cell carcinomas are caused by infection with HPV. But an unwillingness to discuss the disease had led to a lack of awareness and research, said Professor Grulich, the senior investigator on the project.
Anal cancer had increased by about 3.4 per cent annually in men and 1.9 per cent in women since 1982, according to the study published in the journal Vaccine.
Unpublished research by Professor Grulich and his team indicated that in some inner-city suburbs the rate of anal cancer was up to 30 times higher than in the general population.
He said the federal government should immediately include boys in its free HPV vaccination program. "But we do have to recognise that even if we do that, just as it is for women, it could be 20 to 40 years before the maximum benefit is obtained," he said.
He was developing a screening program to detect the early signs of problems caused by HPV.
Anal cancer linked to HPV infection occurred most commonly among women, many of whom said they did not have anal sex, Professor Grulich said.
7 February, 2011
Some babies' brains damaged by mothballs
So they want to ban a very common and very effective household precaution. Why do we all have to suffer for the benefit of a few? Should not the at-risk groups be responsible for avoiding whatever is harmful to them? And whatever the do-gooders suggest in place of naphthalene will undoubtedly be found to be also problematical eventually (as with trans fats). I'm going to stock up on mothballs. I have been using them to good effect for decades
EXPERTS have called for a ban on the sale of mothballs containing the chemical naphthalene, warning that they pose a risk of severe blood problems and even brain damage for significant numbers of Australian babies.
About 5 per cent of Australians of Asian, African, Middle Eastern or Mediterranean descent have an inherited enzyme deficiency, and affected babies can suffer blood-cell breakdown if placed too close to fabrics stored with naphthalene mothballs.
In severe cases this causes jaundice and the yellow pigment linked to jaundice, called bilirubin, can build up in the brain. This causes a condition called kernicterus, triggering neurological problems and sometimes brain damage, The Australian reports.
Australians of predominantly Anglo-Saxon or indigenous background are less commonly affected, but naphthalene can also cause red cell breakdown in those without the G6PD deficiency.
In a letter to the Medical Journal of Australia published today, three pediatric experts from Sydney and one from Christchurch say at least three babies suffered from the brain complications in the past three years. One of them died.
William Tarnow-Mordi, director of the Westmead International Network for Neonatal Education and Research at the University of Sydney, said affected babies could develop massive breakdown of their red blood cells within hours of being wrapped in clothing stored with mothballs containing naphthalene.
"The lifetime costs of caring for a baby with kernicterus are many millions of dollars," Professor Tarnow-Mordi said.
The European Union banned the supply of naphthalene products in 2008, and Professor Tarnow-Mordi said he and other experts were working with the Australian Pesticides and Veterinary Medicines Authority to see whether similar action should be taken in Australia.
"Health authorities in Australia already inform parents about the dangers of mothballs with naphthalene. Without further measures, more babies could sustain brain damage or die," he said. "A total ban on mothballs with naphthalene may now be the safest course."
A gene protects some U.S. blacks from heart disease
Some black Americans have a gene that protects them from heart disease, researchers said on Thursday. About a quarter of African-Americans carry the protective gene, and if they are lucky enough to have two copies, one from each parent, their risk of heart disease is 10 times lower than that of other blacks.
People with just one copy have a five times lower risk of heart attacks, blocked arteries and other symptoms of heart disease, the team reported in the Journal of Human Genetics.
"What we think we have here is the first confirmed hereditary link to cardiovascular disease among African-Americans and it is a protective one," Diane Becker of Johns Hopkins University in Baltimore said in a statement.
The same gene has been studied in people from Japan, South Korea, Europe and elsewhere but not in black people. In fact, few such genetic studies have been done in blacks at all, the researchers said.
The gene is called CDKN2B, and certain mutations raise the risk of heart disease. For instance, Swedish people with one version of the gene were more likely to have strokes if they also had high blood pressure.
Becker's team studied 548 healthy African-American brothers or sisters of people with documented heart disease. They had their genes sequenced and were followed for 17 years.
The researchers noticed a certain type of mutation called a single nucleotide polymorphism, SNP (pronounced "snip") for short, on a gene that had been linked to heart disease in studies of people in Korea, Italy and elsewhere. About 25 percent of the volunteers in Becker's study had this protective version of the gene, and 6 percent had two copies.
Becker's team said when genetic tests became more common it might be worth testing blacks for which version of the gene they have, so those without the protective mutation could be more closely monitored for heart disease, the No. 1 killer in the United States and most other developed countries.
It is becoming more common to use ethnic origin to define disease risks. In 2005, Nitromed Inc. got U.S. Food and Drug Administration approval to market its heart drug BiDil specifically to blacks after it was shown to benefit African-Americans more than whites.
Studies have shown that African-Americans are less likely than whites to be prescribed heart drugs or receive bypass surgery, although blacks have an overall greater risk of heart disease than whites.
6 February, 2011
Milk Fascists in Australia too
Why can't the do-gooders let people take their own risks if they want to? I drank raw milk for a time in my childhood with no visible ill-effects. It did taste better. Most of us kids got TB from it but hardly noticed. We were all healthy country kids so it was just another childhood illness akin to flu which came and quickly went -- leaving us immunized against TB for the rest of our lives. The milk was a very pleasant vaccine.
Dairy inspection standards are now however much stricter than the negligible ones of my far-off childhood so any infection these days is a tiny risk -- and we all take risks
The thirst for raw milk straight from the cow's udder has created a clandestine market among consumers who say it is healthier and tastes better. However, food authorities are determined to stamp out what they say is a highly dangerous and illegal practice.
Peter Melov, of Bondi, was recently fined $53,000 for selling raw milk and raw-milk products through a now-defunct organisation, Global Sov. The products were sold online and at an organic food market in Bondi Junction. "Everyone was coming in asking us for raw milk, and a few shops in Bondi had it, so I thought 'I'll just sell it'," Mr Melov said.
Selling unpasteurised milk and cheese for human consumption is illegal, but it is available to buy under names like "bath milk" in certain health-food shops and markets. The Sun-Herald understands some raw-milk aficionados have exploited this apparent loophole, buying "bath milk" for drinking. Primary Industries Minister Steve Whan said companies selling raw-milk products were putting lives at risk. "There is sound scientific evidence pointing to the risks associated with consuming raw milk," he said.
Mr Melov, who was found guilty in Downing Centre Local Court of 43 breaches of the Food Act, said he had received no complaints from customers, and the NSW Food Authority had not warned him that he was doing anything wrong. He would not risk selling raw milk again, he said. "It was like we had been dealing drugs. If we had just got a phone call, we would have complied completely with the Food Authority."
Medical microbiologist Dr Vitali Sintchenko, of Westmead Hospital, said there were sound reasons why selling raw milk was banned. "There are potential pathogens and toxins present in raw milk that can be life-threatening," he said.
Cheesemaker Will Studd has advocated changes to the legislation banning raw milk. "If we have such a healthy dairy industry, what is everybody so concerned about?" he asked. "Why aren't consumers allowed to enjoy milk in its natural state?" With the right regulation, there would not be any alarm about consuming raw milk and its products, he argued.
Fellow cheesemaker Franck Beaurain does not think it is necessary to relax existing regulations. "I really believe you can do a good job with pasteurised milk. I can't say it [raw milk] tastes better than pasteurised."
Low energy lightbulbs 'could harm 40,000'
Low energy light bulbs could exacerbate the health conditions of up to 40,000 people across Britain, a minister has said. Anne Milton, the public health minister, made the admission after Labour MP Mark Tami asked if the Department of Health had made an assessment of their effects on people with sensitive eye conditions.
Mrs Milton referred to a report by the European Commission's Scientific Committee on Emerging and Newly Identified Health Risks (SCENIHR), which found up to 250,000 people across the EU with certain eye, photosensitive and neurological conditions could be at risk. She said: "Firm figures for the United Kingdom are not available, but the SCENIHR statistics would equate to around 30,000 to 40,000 people that might be affected in the UK."
A 2008 review by the Health Protection Agency warned of the ultra-violet rays emitted by compact fluorescent lamps (CFLs), and the flickering nature of the light they gave off. The former can trigger rashes in a small number of people, as well as flare-ups of lupus, an autoimmune disease whose symptoms include fatigue and joint pain. The latter can induce eye pain and could even increase the incidence of repetitive behaviour in autistic people.
Mrs Milton added: "The Department is continuing to work with the HPA, patient groups, clinicians and the lighting industry to keep the health issues under review." [Good of them!]
5 February, 2011
Food campaigners face fury over sick spoof of Al Qaeda video in which Ronald McDonald is 'held hostage'
Campaigners for ethically-produced food provoked fury today by releasing an Al Qaeda-style spoof video in which they are seen holding Ronald McDonald hostage.
Five members of the Finnish group, who call themselves the Food Liberation Army, are seen in black balaclavas with the fictional McDonald's figurehead recognisable in the foreground despite wearing a hood.
The spokesperson threatens to execute the character if the fast food chain refuses to answer questions about how it produces its products.
McDonalds told MailOnline today that the stunt was irresponsible and 'in very poor taste'. It also denied the campaign group's suggestion that it was attempting to hide details about its food quality and manufacturing processes.
A spokeswoman said: 'McDonald’s is always available to engage in constructive conversations with our customers, stakeholders and the media. 'This stunt is in very poor taste and not a responsible approach to meaningful dialogue. 'Meanwhile, we are focused on our customers and are fully transparent about our high quality food and industry-leading standards and practices.'
The video sees the FLA demand that McDonalds release information about its manufacturing process, and the additives used in its product. The campaign group also asks why McDonalds will not release figures detailing how much unrecycled waste it produces each year.
Speaking in Finnish, with an English translation running across the bottom of the screen, the spokesperson says: 'We are a Food Liberation Army, and we hope that this extreme action will take us towards a better and safer food future.
Bone drug 'could give you another five years of life'
Highly speculative. With only 35 men and 106 women on bisphosphonates and no attempt at sampling, this could very easily be a rogue result. See the comments in the last paragraph below. Journal article here
A drug taken by hundreds of thousands of women for osteoporosis could extend life by up to five years, scientists claim. Bisphosphonates – the most commonly prescribed treatments for the bone-thinning disease – have been shown to reduce death rates by as much as 80 per cent among those over 75.
Researchers believe they could extend an elderly person’s life by around five years. They suspect that the powerful drugs, which prevent the loss of bone, may also reduce the levels of toxic substances in the body which can cause cancer, high blood pressure and other illnesses.
Around 553,000 patients in Britain are currently taking bisphosphonates, which are sold under brand names such as Actonel, Reclast and Boniva. The majority are prescribed to post-menopausal women suffering from osteoporosis, but they are also used to treat those with arthritis and certain types of cancer.
Scientists in Australia looked at 2,000 volunteers over the age of 70, including 121 who had been taking bisphosphonates for an average of three years. They found that after five years the death rate among those who had been taking the drugs was 80 per cent lower than average.
The scientists believe that by reducing bone loss, the drugs may be preventing the release of toxic substances such as lead into the blood. Lead gradually accumulates in the body from petrol and paint on walls and is stored in the bones for decades. But when the bones break down and it is released into the bloodstream it can be harmful and has been linked to high blood pressure, kidney disease and memory loss.
The study, published in the Journal of Clinical Endocrinology and Metabolism, also suggests that the bones store another toxic chemical, cadmium – naturally found in soils – which has been linked to lung cancer.
Associate Professor Jacqueline Center, from the Garvan Institute of Medical Research in Sydney, said: ‘While the results seemed surprisingly good, they are borne out by the data – within the limitations of any study – and appear to apply to men as well as women.’
But others were sceptical of the findings as the study only involved a small number of participants. A spokesman for Arthritis Research UK, which represents a number of patients prescribed the drug for bone-thinning, said: ‘It was quite a small study, and much bigger randomised controlled trials have not shown that bisphosphonates extend lifespan.’
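The sceptics' point about the small sample can be made concrete: with only around 121 people taking the drugs, any observed death rate carries a wide margin of uncertainty. A minimal sketch in Python, using the Wilson score interval and purely hypothetical counts (5 deaths out of 121 is an illustration, not a figure from the study):

```python
import math

def wilson_ci(k, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion
    (k events observed out of n trials)."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Hypothetical: 5 deaths among 121 drug-takers over the follow-up period
lo, hi = wilson_ci(5, 121)
```

With these made-up numbers the interval runs from under 2 per cent to over 9 per cent, i.e. the true rate could plausibly be several times the observed one. That width, not the point estimate, is why a headline "80 per cent lower" from so few participants invites caution.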
4 February, 2011
"Healthy" lifestyle 'would prevent 79,000 cancers a year' (?)
The WCRF publicity machine hard at work again in the report below. Their "estimates" are based on an uncritical acceptance of epidemiological claims -- all of which become moot under close examination. The WCRF is just a donation-fed private club that needs to keep making scary announcements to get donations -- a lot like the SPLC
Almost 80,000 cancers could be prevented in Britain every year if everybody followed a healthy lifestyle, the World Cancer Research Fund (WCRF) has estimated.
Breast and bowel cancer cases could almost be halved, with reductions of 42 and 43 per cent respectively, the charity has calculated.
The changes would involve people taking regular exercise - such as walking for half an hour a day - maintaining a healthy weight and eating a healthy diet.
Studies have consistently found relatively few people recognise that taking such measures can cut the risk of a range of cancers.
Professor Martin Wiseman, medical and scientific adviser for the WCRF, said: "It is distressing that even in 2011 people are dying unnecessarily from cancers that could be prevented through maintaining a healthy weight, diet, physical activity and other lifestyle factors. "We still have a chance to avert a big increase in cancer cases in the UK, so we urge the public and the Government to make cancer prevention a public health priority."
The estimate that 79,000 cancers could be prevented nationwide was released to mark World Cancer Day.
Every year almost 300,000 cancers are diagnosed in Britain, according to the Office for National Statistics. Cancers kill more than 150,000 people a year, making it the second biggest killer after heart disease.
Junk food companies told British Government wants to avoid 'intrusive' laws
The Health Secretary has told junk food manufacturers he wants to avoid "intrusive, restrictive and costly regulation". Andrew Lansley said to senior executives from companies including Mars and PepsiCo that ministers were not interested in "nannying" people about their food choices.
His comments come despite criticism of the Government’s decision to roll back spending on the Change4Life health campaign, in favour of getting commercial companies and charities to fill the gap. Cadbury, Unilever, Coca-Cola, Kellogg's, Kraft, Mars, Nestle and PepsiCo have all been involved, alongside Britvic and supermarket giant Tesco.
At a public health debate held by The Grocer trade magazine on Wednesday, Mr Lansley said there had been an imbalance in health policy for years, with "big state dominating over big society". Now there needed to be a "wide range" of interventions to achieve improved health - regulation was just one option.
Mr Lansley said he would not be "scared" to use industry to achieve public health and commercial aims, because companies could reach consumers in a way ministers could not. He added: “Hopefully there will be no need for intrusive, restrictive and costly regulation.”
The Health Secretary also admitted he himself falls outside the healthy weight range. "I know that I'm overweight," he said. "I know that I should have a BMI of 25 and I'm a BMI of 28."
Tam Fry, of the National Obesity Forum, said: "Mr Lansley's reassurance to junk food manufacturers beggars belief. "As Health Secretary, his duty is to reassure the public that the food we are asked to choose from is as healthy as it can be.”
Jacqui Schneider, of the Children’s Food Campaign, added: “There’s absolutely no evidence that industry – which has a vested interest – is able to bring about a change in people’s behaviour.”
3 February, 2011
How coffee can boost the brainpower of women... but scrambles men's thinking
The heading above reflects the heading of the academic journal article concerned but is a poor reflection of the journal abstract, which follows:
"We tested whether increased caffeine consumption exacerbates stress and disrupts team performance, and we explored whether “tend and befriend” characterizes women's coping. We gave decaffeinated coffees, half of which contained added caffeine, to coffee drinkers in same-sex, same-aged dyads. We measured individual cognitive appraisals, emotional feelings, bodily symptoms, coping, and performance evaluations, together with dyad memory, psychomotor performance, and negotiation skills under higher or lower stressful conditions. Evidence consistent with the first hypothesis was weak, but we found that women performed better than did men on collaborative tasks under stress, provided caffeine had been consumed. The usefulness of multi component, cognitive-relational approaches to studying the effects of caffeine on stress is discussed, together with special implications of the effects for men"
It would appear that the authors have over-interpreted their results. What they found was that women were better at collaborative tasks, which is not big news, given the female specialization in socio-emotional relationships. What is interesting is that women needed caffeine to bring out their greater abilities in that respect. One would have thought that they would be superior with or without caffeine. I suspect poor experimental design -- unrepresentative sampling etc. There is no obvious reason why caffeine should affect men and women differently
Next time you have a high-pressure meeting at work, keep an eye on what goes into your colleagues’ cups. Drinking coffee improves women’s brainpower in stressful situations – but sends men into meltdown, according to a study. While sipping a cappuccino or downing an espresso boosts women’s performance when working with others, the same drinks impair men’s memories and slow their decision-making.
And given that Britons get through some 70million cups of coffee a day, the implications are significant, say the researchers.
Psychologist Dr Lindsay St Claire said: ‘Many meetings, including those at which military and other decisions are made, are likely to be male-dominated.
‘Because caffeine is the most widely consumed drug in the world, the global implications are potentially staggering.’
The researchers, from Bristol University, wanted to examine what coffee does to the body when it is already under stress, such as during a tense meeting.
They recruited 64 men and women and put them in same-sex pairs. Each pair was given a range of tasks to complete, including carrying out negotiations, completing puzzles and tackling memory challenges, and told they would have to give a public presentation relating to their tasks afterwards. Half of the pairs were given decaffeinated coffee, while the others were handed a cup containing a large shot of caffeine.
The researchers found that men’s performance in memory tests was ‘greatly impaired’ if they drank the caffeinated coffee. They also took an average of 20 seconds longer to complete the puzzles than those on the decaffeinated coffee.
But women completed the puzzles 100 seconds faster if they had been given caffeine, the Journal of Applied Social Psychology reports.
New antibiotic weapon in war on hospital superbug C.diff
A new antibiotic could transform treatment of hospital patients with life-threatening C. diff infections. A study shows Fidaxomicin – the first advance in the field for decades – cuts the rate of repeat infections by 45 per cent compared with an existing antibiotic. It also shortens the duration of diarrhoea symptoms affecting hundreds of NHS patients each year.
Experts claim the drug helps preserve the natural ‘good’ bugs in the intestine that are normally wiped out by diarrhoea and the action of conventional antibiotics.
C. diff is among the more virulent of hospital-acquired infections, with more than eight out of ten victims of C. diff aged 65 and over. Almost one in three people contracting the debilitating stomach bug suffers recurrent illness because current treatment fails to eradicate it.
A study of 629 patients published in the New England Journal of Medicine marks the final stage of investigation of the drug prior to licensing for clinical use.
Study co-leader Dr Mark Miller, head of the Division of Infectious Diseases at the Jewish General Hospital in Montreal, Canada, said: ‘There wasn’t much interest in C. difficile for many years, because it wasn’t considered a serious disease.
'However, over the past decade the bacterium has mutated into something much more serious that has caused epidemics worldwide.’
Hospitals have installed alcohol hand gel dispensers to reduce the risk from superbugs
Fidaxomicin, developed by Optimer Pharmaceuticals of San Diego, is the first in a new class of antibiotics. It is only minimally absorbed from the gut into the bloodstream, which means its killing power is specifically targeted at C. diff in the intestine.
Patients taking part in the phase 3 trial were randomly allocated oral treatment with Fidaxomicin or the antibiotic Vancomycin for ten days. Just 15 per cent of those treated with the new drug suffered recurrences compared with 25 per cent of those given Vancomycin, a 45 per cent reduction.
The length of illness was shorter for those on Fidaxomicin and the low rate of side effects was similar in both groups.
Dr Miller said: ‘Anybody who knows C. difficile recognises that recurrences are the major problem with this disease. 'Anything that can reduce the recurrence rate, especially as dramatically as Fidaxomicin, is a very important milestone in the treatment of C. difficile.’
Latest figures show there were 783 C. diff infections in hospitals in England in December 2010. There were almost 13,000 hospital infections recorded between October 2009 and October 2010.
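The "45 per cent reduction" quoted above is a relative risk reduction: the drop in the recurrence rate expressed as a fraction of the comparison arm's rate. A minimal sketch of the arithmetic (note that the rounded rates reported here, 15 versus 25 per cent, work out to a 40 per cent relative reduction; the 45 per cent figure presumably comes from the unrounded trial data):

```python
def relative_risk_reduction(control_rate, treatment_rate):
    """Fraction of the control-arm event rate eliminated by treatment."""
    return (control_rate - treatment_rate) / control_rate

# Rounded rates from the article: 25% recurrence on Vancomycin,
# 15% on Fidaxomicin
rrr = relative_risk_reduction(0.25, 0.15)  # ≈ 0.40, i.e. a 40% relative reduction
```

The same 10-percentage-point gap reads as an absolute risk reduction of 10 per cent, which is the less dramatic but often more informative way to state a trial result.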
2 February, 2011
Fat is good for you -- and organic food isn't
Eating is good for you – and eating a lot is even better. There, I've said it; but before the obesity police take me away for re-education in the cells beneath the Department of Health, I offer – as evidence in my defence against high treason in the war against fat – some findings published in the latest edition of Nutrition Journal, the results of a survey of 350,000 randomly selected Americans.
The analysis, in this highly respected academic publication, concludes that overweight people live longer lives, and that those who are obese in old age tend to live longer than those who are thin. They are also, say the researchers from the University of California, more likely to survive certain dangerous medical conditions such as heart disease, renal failure and type 2 diabetes.
The lead researcher, Dr Linda Bacon, concludes: “It is overwhelmingly apparent that fat has been highly exaggerated as a risk for disease or decreased longevity. For decades the US public health establishment and the $58.6bn a year private weight-loss industry have focused on health improvement through weight loss. The result is unprecedented levels of body dissatisfaction and failure in achieving desired health outcomes.”
Hallelujah and pass the cream! Actually Dr Bacon’s observations elide two very different phenomena, which should be treated separately. As she points out, there is the publicly funded campaign (as bloated here as it is in the US) which spends countless millions telling us that fat in food is a terrible thing because it will kill us off prematurely – an assertion for which there is very little evidence, but which has the status of holy writ.
Then there is the obsession with being thin, a phenomenon which is overwhelmingly the province of well-to-do women; this has nothing to do with health, and does not make any claims to be beneficial, in the medical sense. It is about image rather than content – and no less significant for that.
This distinction was brilliantly encapsulated by HG Wells in his short story “The Truth About Pyecraft”. Mr Pyecraft is a morbidly obese man who constantly complains to a fellow member of a London club, Mr Formalyn, about the misery caused by his excessive weight. Eventually Formalyn gives Pyecraft a secret weight-loss formula. It works: after taking it, Pyecraft floats up to the ceiling, where he remains helplessly until he is handed several volumes of Encyclopaedia Britannica, which give him the necessary ballast to return to earth. Wells was making the distinction between weight – or mass, if you prefer – and fatness.
Thus, while the average dieter says that she wants to lose weight, what she really means is that she wants to be a different shape; it has nothing to do with being a “healthy weight” (whatever that is); and indeed such a dieter would almost certainly give up the last five years of her life expectancy in exchange for what she would regard as a perfect figure – although admittedly that view might be amended when the Faustian payment became due.
Nevertheless, because thinness is, under the modern Western dispensation, seen to be beautiful, and because beauty has always been regarded (with no basis in fact) as a visual manifestation of underlying virtue, dieting has become a form of self-sanctification, of moral purification.
It has sometimes been argued that in an increasingly irreligious age, the body has become the most fashionable temple, to which devotion is paid in a peculiarly self-obsessed manner, to the benefit of no one but the owner.
In fact, religion and diet have always been closely linked. Fasting – still practised in one form or another by strictly observant adherents to each of the great Abrahamic faiths – is the most obvious example; but religious dietary laws go well beyond that, as Jews and Muslims can attest. Although some claim that the dietary rules of what had once been desert tribes can be explained by what would have been common sense and hygiene in the days before refrigeration, the truth is that such rules were never intended to be for the benefit of the physical health of adherents. They were always about being virtuous, in this case by obeying God’s word unquestioningly, and to be part of a group who by eating in a certain (highly constrained) way, were closer to that God.
There really is not such a vast difference, psychologically, between those ancient faith identities and the modern cults of food faddism. The latter, too, identify certain food items as inherently and absolutely “bad”, quite distinct from any nutritional or scientific evidence. Vegans are an extreme example of this tendency, in that they are able to judge all those who do not observe their own strictures as being participants in a form of collective wickedness. Yet other, less ascetic, brands of nutritional nonsense are also little more than food cults – for example, the so-called “organic” movement, which has as its high priest the future head of our established church, the Prince of Wales.
While every meta-analysis conducted over decades has failed to demonstrate that “organic” food has any superior nutritional properties to that grown with the aid of chemical fertilisers, the adherents to this cult remain unalterably convinced that they and their children will live longer and healthier lives than those who do not adhere to the same quasi-religious dietary law. They believe that so-called “non-organic food” – in reality there is no such thing – gives people cancer; yet since 1950, as pesticides and industrial farming have taken an increasing role in food production, stomach cancer rates have actually declined by 60 per cent in Western countries.
Back in 1974, Dr Lois McBean and Dr Elwood Speckmann produced what remains the most incisive demolition of the pseudo-nutritional cults, in a paper on what they termed “food faddism”.
Their list of its typical adherents bears repeating today: "Miracle seekers or those who adhere to an uncritical belief in bizarre or unrealistic promises; ritual or authority seekers; those pursuing ‘super’ health; paranoiacs; ‘truth’ seekers; fashion followers … and the ‘afraid’ who are anxious about the uncertainties and threats of living."
That last category is the most telling; it makes the point that those who are frightened by food are frightened by life itself. Eat up – and enjoy yourself.
Popeye had it right: spinach really does make you stronger
A good journal summary: "A decline in mitochondrial function occurs in many conditions. A report in this issue of Cell Metabolism shows that dietary inorganic nitrates enhance muscle mitochondrial efficiency. It is an attractive hypothesis that dietary changes enhance energy efficiency, but its potential application depends on long-term studies investigating net benefits versus adverse effects"
Popeye's taste for a can of spinach before a fight has a genuine scientific basis, researchers have found: the leafy green vegetable really can boost your muscle power.
It was thought the iron content of spinach accounted for its status as a superfood. But researchers at the Karolinska Institute in Sweden have found that the inorganic nitrate it contains is the secret of its strength-giving property.
They gave a small dose of nitrate, equivalent to that found in a plate of spinach, to exercising volunteers and found it reduced their need for oxygen. Their improved muscle performance was due to increased efficiency of the mitochondria that power cells, the researchers wrote in Cell Metabolism. They suggest their finding could offer one explanation for the health benefit of eating fruit and vegetables, the mechanism of which still remains unclear.
Great efforts have been made to isolate the key ingredients of a healthy diet, with limited success. Nitrates could be one of those key ingredients. They interact with bacteria in the mouth to produce nitric oxide, a physiologically important molecule with a key role in lowering blood pressure.
1 February, 2011
Walking linked to better memory
This looks good at first, but no brain area has one simple function, and a larger hippocampus could have many implications. Moreover, the increase in size was related to improved cognition in only one group, which rather undermines the attempted generalization. Something much more complex is going on
A section of the brain involved in memory grew in size in older people who regularly took brisk walks for a year, US researchers reported today. The new study reinforces previous findings that aerobic exercise seems to reduce brain atrophy in early-stage Alzheimer's patients, and that walking leads to slight improvement on mental tests among older people with memory problems.
The new analysis, led by researchers at the University of Pittsburgh and University of Illinois at Urbana-Champaign, appears in today's edition of the Proceedings of the National Academy of Sciences.
The study involved 120 sedentary people, ages 55 to 80. They were divided into two groups. Half began a program of walking for 40 minutes a day, three days a week to increase their heart rate; the others only did stretching and toning exercises.
The hippocampus, a region of the brain involved in memory, tends to shrink slightly with age and that's what happened in the group that only did stretching. But among people who took part in the walking program, the hippocampus region of the brain grew in size by roughly 2 per cent.
Researchers found that there was some memory improvement in both groups, but "in the aerobic exercise group, increased hippocampal volume was directly related to improvements in memory performance".
"We think of the atrophy of the hippocampus in later life as almost inevitable," Kirk Erickson, professor of psychology at the University of Pittsburgh and the paper's lead author, said.
Art Kramer, director of the Beckman Institute at the University of Illinois and the senior author said: "The results of our study are particularly interesting in that they suggest that even modest amounts of exercise by sedentary older adults can lead to substantial improvements in memory and brain health."
Dr Jeffrey Burns of the neurology department at the University of Kansas School of Medicine, said he was "enthusiastic" about the paper. Dr Burns, who wasn't involved in the new research, said that while previous studies have pointed to the relationship between exercise and memory, this rigorous, year-long study advances what's known about the brain and exercise.
Where's the Beef?
By all accounts, Taco Bell is a story of success. Since Glen Bell opened the first Taco Bell in Downey, California, in 1962, the franchise has expanded to 6,446 restaurants with over 175,000 employees worldwide. In 2009, the company (which is currently owned by Yum! Brands) brought in $1.9 billion in revenue. It is no secret why this restaurant has experienced such growth. Like its rivals in the fast food industry, Taco Bell specializes in offering meals to its customers at the cheapest possible price. Today, the company is under attack by a publicity-seeking law firm and a media that is all-too-eager to exploit any potential controversy, no matter how frivolous. What should be a story about how a private business feeds millions of people for what amounts to pocket change is instead a pseudo-investigation into what qualifies as ground beef.
No one has ever gone into a Taco Bell under the illusion that they were purchasing quality food, because we are all aware that you cannot stuff 460 calories into a burrito and charge 99 cents without sacrificing something. Its cheapness is the foundation of its appeal, and even the company acknowledges this fact with its advertising slogans “Big Variety, Small Price,” and “Why Pay More?” The choice to offer quantity over quality has not come without damage to its reputation, however. For years, people have made jokes about the poor quality of their food and speculated about “what was really in the meat,” but over 36.8 million customers in the United States continue to eat there every week anyway.
On January 19, Montgomery-based law firm Beasley Allen filed a class action lawsuit in a California court alleging that Taco Bell’s beef does not meet the U.S. Department of Agriculture’s minimum requirements to be defined as “beef,” and that therefore the company is guilty of false advertising. Taking actions that I myself have advocated in a previous column, Beasley Allen had Taco Bell’s “meat mixture” independently tested and found it to contain less than 35 percent beef. Other ingredients include water, wheat oats, soy, maltodextrin, anti-dusting agent, corn starch, and silicon dioxide. Wheat oats and soy are both perfectly edible and inoffensive, and maltodextrin (though it sounds bad) is a food additive made from starch found in dried taco seasoning as well as soda, candy, and even beer. Silicon dioxide is a common anti-caking additive used to prevent ingredients from clumping together.
Unlike lawsuits related to food safety, however, this particular lawsuit is a farcical use of the legal system with no basis in consumer protection. On its website, Taco Bell displays all ingredients, along with nutrition information, for every item on the menu. There is no secret to what is in its “meat mixture.” The only real basis for this lawsuit, the 35 percent ratio of beef to other ingredients, is, frankly, a bizarre and arbitrary guideline established by the U.S. Department of Agriculture, which requires only 40 percent meat in any product labeled “meat.” Once you allow anything less than 100 percent of a product to be defined as that product, what does it really matter what the percentage is? According to the USDA, I could concoct a mixture consisting of 40 percent ground beef and 60 percent cookie dough and call it meat. This says more about the uselessness of the USDA as a regulatory body than it does about Taco Bell’s ingredients.
As a company, Taco Bell is transparent about its products. Anyone at any time can go on its website, or do a little research, and find out what is being stuffed in their tacos. None of the ingredients are harmful, and no one has ever been led to believe that they are eating a gourmet meal when they go there. Ground beef is expensive, and so fillers like wheat and soy are mixed in to get the most out of their product. That is why a taco at Taco Bell is so much cheaper than one at Chipotle. One restaurant offers quantity and the other offers quality, but you get what you pay for and there is plenty of room in the marketplace for both.
We should be celebrating the fact that innovation and entrepreneurship have brought a wide variety of food options to the table for people of all economic backgrounds, not attacking a company for providing cheap food at a cheap price. Instead, law firms should focus their litigation on serious issues of food safety and workplace standards in fast food establishments. How much beef is in a Taco Bell taco? Less than 99 cents worth, and that’s all anyone has ever paid for.
SITE MOTTO: "Epidemiology is mostly bunk"
Where it is not bunk is when it shows that some treatment or influence has no effect on lifespan or disease incidence. It is as convincing as disproof as it is unconvincing as proof. Think about it. As Einstein said: No amount of experimentation can ever prove me right; a single experiment can prove me wrong.
Epidemiological studies are useful for generating hypotheses, or for testing hypotheses already examined in experimental work, but they do not by themselves support causal inferences
The standard of reasoning that one commonly finds in epidemiological journal articles is akin to the following false syllogism:
Chairs have legs
You have legs
So therefore you are a chair
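The error can be put in set terms, with a trivial (and obviously invented) Python sketch: chairs and you are both members of the set of things with legs, but shared membership in a set does not make two things the same.

```python
# Chairs and people both belong to the set of "things with legs",
# but sharing a property does not make two things identical.
things_with_legs = {"chair", "table", "dog", "you"}
chairs = {"chair"}

assert "you" in things_with_legs   # you have legs
assert "you" not in chairs         # ...but you are still not a chair
```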
SALT -- SALT -- SALT
1). A good example of an epidemiological disproof concerns the dreaded salt (NaCl). We are constantly told that we eat too much salt for good health and must cut back our consumption of it. Yet there is one nation that consumes huge amounts of salt. So do they all die young there? Quite the reverse: Japan has the world's highest concentration of centenarians. Taste Japan's favourite sauce -- soy sauce -- if you want to understand Japanese salt consumption. It's almost solid salt.
2). We need a daily salt intake to counter salt-loss through perspiration and the research shows that people on salt-restricted diets die SOONER. So the conventional wisdom is not only wrong. It is positively harmful
3). Table salt is a major source of iodine, which is why salt is normally "iodized" by official decree. Cutting back salt consumption runs the risk of iodine deficiency, with its huge adverse health impacts -- goiter, mental retardation etc. GIVE YOUR BABY PLENTY OF SALTY FOODS -- unless you want to turn it into a cretin
THE SIDE-EFFECT MANIA. If a drug is shown to have troublesome side-effects, there are always calls for it to be banned or not authorized for use in the first place. But that is insane. ALL drugs have side effects. Even aspirin causes stomach bleeding, for instance -- and paracetamol (acetaminophen) can wreck your liver. If a drug has no side effects, it will have no main effects either. If you want a side-effect-free drug, take a homeopathic remedy. They're just water.
Although I am an atheist, I have never wavered from my view that the New Testament is the best guide to living and I still enjoy reading it. Here is what the apostle Paul says about vegetarians: "For one believeth that he may eat all things: another, who is weak, eateth herbs. Let not him that eateth despise him that eateth not; and let not him which eateth not judge him that eateth." (Romans 14:2-3). What perfect advice! That is real tolerance: Very different from the dogmatism of the food freaks. Interesting that vegetarianism is such an old compulsion, though.
Even if we concede that getting fat shortens your life, what right has anybody got to question someone's decision to accept that tradeoff for themselves? Such a decision could be just one version of the old idea that it is best to have a short life but a merry one. Even the Bible is supportive of that thinking. See Ecclesiastes 8:15 and Isaiah 22:13. To deny the right to make such a personal decision is plainly Fascistic.
Fatties actually SAVE the taxpayer money
IQ: Political correctness makes IQ generally unmentionable so it is rarely controlled for in epidemiological studies. This is extremely regrettable as it tends to vitiate findings that do not control for it. When it is examined, it is routinely found to have pervasive effects. We read, for instance, that "The mother's IQ was more highly predictive of breastfeeding status than were her race, education, age, poverty status, smoking, the home environment, or the child's birth weight or birth order". So political correctness can render otherwise interesting findings moot
That hallowed fish oil is strongly linked to increased incidence of colon cancer
"To kill an error is as good a service as, and sometimes better than, the establishing of a new truth or fact" -- Charles Darwin
"Most men die of their remedies, not of their diseases", said Moliere. That may no longer be true but there is still a lot of false medical "wisdom" around that does harm to various degrees. And showing its falsity is rarely the problem. The problem is getting people -- medical researchers in particular -- to abandon their preconceptions
Bertrand Russell could have been talking about today's conventional dietary "wisdom" when he said: "The fact that an opinion has been widely held is no evidence whatever that it is not utterly absurd; indeed in view of the silliness of the majority of mankind, a widespread belief is more likely to be foolish than sensible.”
Eating lots of fruit and vegetables is NOT beneficial
"Obesity" is 77% genetic. So trying to make fatties slim is punishing them for the way they were born. That sort of thing is furiously condemned in relation to homosexuals so why is it OK for fatties?
Some more problems with the "Obesity" war:
1). It tries to impose behavior change on everybody -- when most of those targeted are not obese and hence have no reason to change their behaviour. It is a form of punishing the innocent and the guilty alike. (It is also typical of Leftist thinking: scorning the individual and dealing only with large groups).
2). The longevity research all leads to the conclusion that it is people of MIDDLING weight who live longest -- not slim people. So the "epidemic" of obesity is in fact largely an "epidemic" of living longer.
3). It is total calorie intake that makes you fat -- not where you get your calories. Policies that attack only the source of the calories (e.g. "junk food") without addressing total calorie intake are hence pissing into the wind. People involuntarily deprived of their preferred calorie intake from one source are highly likely to seek and find their calories elsewhere.
5). So-called junk food is perfectly nutritious. A Big Mac meal comprises meat, bread, salad and potatoes -- which is a mainstream Western diet. If that is bad then we are all in big trouble.
5). Food warriors demonize dietary fat. But Eskimos living on their traditional diet eat huge amounts of fat with no apparent ill-effects. At any given age they in fact have an exceptionally LOW incidence of cardiovascular disease. And the average home-cooked roast dinner has LOTS of fat. Will we ban roast dinners?
6). The foods restricted are often no more calorific than those permitted -- such as milk and fruit-juice drinks.
7). Tendency to weight is mostly genetic and is therefore not readily susceptible to voluntary behaviour change.
8). And when are we going to ban cheese? Cheese is a concentrated calorie bomb and has lots of that wicked animal fat in it too. Wouldn't we all be better off without it? And what about butter and margarine? They are just about pure fat. Surely they should be treated as contraband in kids' lunchboxes! [/sarcasm].
9). And how odd it is that we never hear of the huge American study which showed that women who eat lots of veggies have an INCREASED risk of stomach cancer? So the official recommendation to eat five lots of veggies every day might just be creating lots of cancer for the future! It's as plausible (i.e. not very) as all the other dietary "wisdom" we read about fat etc.
10). And will "this generation of Western children be the first in history to lead shorter lives than their parents did"? This is another anti-fat scare that emanates from a much-cited editorial in a prominent medical journal that said so. Yet this editorial offered no statistical basis for its opinion -- an opinion that flies directly in the face of the available evidence.
11). A major cause of increasing obesity is certainly the campaign against it -- as dieting usually makes people FATTER. If there were any sincerity to the obesity warriors, they would ban all diet advertising and otherwise shut up about it. Re-authorizing now-banned school playground activities and school outings would help too. But it is so much easier to blame obesity on the evil "multinationals" than it is to blame it on your own restrictions on the natural activities of kids
12). Fascism: "What we should be doing is monitoring children from birth so we can detect any deviations from the norm at an early stage and action can be taken". Who said that? Joe Stalin? Adolf Hitler? Orwell's "Big Brother"? The Spanish Inquisition? Generalissimo Francisco Franco Bahamonde? None of those. It was Dr Colin Waine, chairman of Britain's National Obesity Forum. What a fine fellow!
Trans fats: For one summary of the weak science behind the "trans-fat" hysteria, see here. Trans fats have only a temporary effect on blood chemistry and the evidence of lasting harm from them is dubious. By taking extreme groups in trans fats intake, some weak association with coronary heart disease has at times been shown in some sub-populations but extreme group studies are inherently at risk of confounding with other factors and are intrinsically of little interest to the average person.
The "antioxidant" religion: The experimental evidence is that antioxidants SHORTEN your life, if anything. Studies here and here and here and here and here and here and here, for instance. That they are of benefit is a great theory but it is one that has been coshed by reality plenty of times.
The medical consensus is often wrong. The best known wrongheaded medical orthodoxy is that stomach ulcers could not be caused by bacteria because the stomach is so acidic. Disproof of that view first appeared in 1875 (Yes. 1875) but the falsity of the view was not widely recognized until 1990. Only heroic efforts finally overturned the consensus and led to a cure for stomach ulcers. See here and here and here.
NOTE: "No trial has ever demonstrated benefits from reducing dietary saturated fat".
Huge ($400 million) clinical trial shows that a low fat diet is useless . See also here and here
Dieticians are just modern-day witch-doctors. There is no undergirding in double-blind studies for their usual recommendations
The fragility of current medical wisdom: Would you believe that even Old Testament wisdom can sometimes trump medical wisdom? Note this quote: "Spiess discussed Swedish research on cardiac patients that compared Jehovah's Witnesses who refused blood transfusions to patients with similar disease progression during open-heart surgery. The research found those who refused transfusions had noticeably better survival rates."
Relying on the popular wisdom can certainly hurt you personally: "The scientific consensus of a quarter-century ago turned into the arthritic nightmare of today."
Medical wisdom can in fact fly in the face of the known facts. How often do we hear reverent praise for the Mediterranean diet? Yet both Australians and Japanese live longer than Greeks and Italians, despite having very different diets. The traditional Australian diet is in fact about as opposite to the Mediterranean diet as you can get. The reverence for the Mediterranean diet can only be understood therefore as some sort of Anglo-Saxon cultural cringe. It is quite brainless. Why are not the Australian and Japanese diets extolled if health is the matter at issue?
Since many of my posts here make severe criticisms of medical research, I should perhaps point out that I am also a severe critic of much research in my own field of psychology. See here and here
This is NOT an "alternative medicine" site. Perhaps the only (weak) excuse for the poorly substantiated claims that often appear in the medical literature is the even poorer level of substantiation offered in the "alternative" literature.
I used to teach social statistics in a major Australian university and I find medical statistics pretty obfuscatory. They seem uniformly designed to make mountains out of molehills. Many times in the academic literature I have excoriated my colleagues in psychology and sociology for going ga-ga over very weak correlations but what I find in the medical literature makes the findings in the social sciences look positively muscular. In fact, medical findings are almost never reported as correlations -- because to do so would exhibit how laughably trivial they generally are. If (say) 3 individuals in a thousand in a control group had some sort of an adverse outcome versus 4 out of a thousand in a group undergoing some treatment, the difference will be published in the medical literature with great excitement and intimations of its importance. In fact, of course, such small differences are almost certainly random noise and are in any rational calculus unimportant. And statistical significance is little help in determining the importance of a finding. Statistical significance simply tells you that the result was unlikely to be an effect of small sample size. But a statistically significant difference could have been due to any number of other randomly-present factors.
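To see just how trivial such differences are, here is a back-of-the-envelope Python sketch that converts the hypothetical 4-in-1,000 versus 3-in-1,000 outcome above into an ordinary correlation coefficient (the phi coefficient for a 2x2 table, which is just Pearson's r applied to two binary variables):

```python
import math

def phi_coefficient(a, b, c, d):
    """Phi (Pearson) correlation for a 2x2 outcome table:
              event   no event
    group 1     a        b
    group 2     c        d
    """
    num = a * d - b * c
    den = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return num / den

# 4 adverse outcomes per 1,000 treated vs 3 per 1,000 in the control group
phi = phi_coefficient(4, 996, 3, 997)
print(round(phi, 4))  # about 0.008 -- a vanishingly weak correlation
```

A correlation of roughly .008 would be laughed out of any psychology journal, yet differences of this size are routinely reported in the medical literature as important findings.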
Even statistical correlations far stronger than anything found in medical research may disappear if more data are used. A remarkable example from sociology: "The modern literature on hate crimes began with a remarkable 1933 book by Arthur Raper titled The Tragedy of Lynching. Raper assembled data on the number of lynchings each year in the South and on the price of an acre's yield of cotton. He calculated the correlation coefficient between the two series at -0.532. In other words, when the economy was doing well, the number of lynchings was lower.... In 2001, Donald Green, Laurence McFalls, and Jennifer Smith published a paper that demolished the alleged connection between economic conditions and lynchings in Raper's data. Raper had the misfortune of stopping his analysis in 1929. After the Great Depression hit, the price of cotton plummeted and economic conditions deteriorated, yet lynchings continued to fall. The correlation disappeared altogether when more years of data were added." So we must be sure to base our conclusions on ALL the data. But in medical research, data selectivity and the "overlooking" of discordant research findings are epidemic.
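The mechanism is easy to reproduce. A minimal Python sketch with invented toy numbers (NOT Raper's actual data) shows how a "perfect" correlation in a truncated window can vanish once later observations are added:

```python
def pearson(xs, ys):
    """Plain Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Illustrative numbers only: while "price" rises, "lynchings" fall --
# then the price crashes but the downward trend simply continues.
price     = [1, 2, 3, 4, 5, 6, 1, 1, 1]
lynchings = [60, 50, 40, 30, 20, 10, 8, 6, 4]

print(pearson(price[:6], lynchings[:6]))  # strong negative correlation in the early window
print(pearson(price, lynchings))          # near zero once the later years are added
```

The first six "years" give a textbook negative correlation; the full series gives almost none. Stopping the analysis at the wrong point manufactures the finding.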
The intellectual Roman Emperor Marcus Aurelius (AD 121-180) could have been speaking of the prevailing health "wisdom" of today when he said: "The object in life is not to be on the side of the majority, but to escape finding oneself in the ranks of the insane."
The Federal Reference Manual on Scientific Evidence, Second Edition says (p. 384): "the threshold for concluding that an agent was more likely than not the cause of an individual's disease is a relative risk greater than 2.0." Very few of the studies criticized on this blog meet that criterion.
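For comparison, here is the relative risk implied by the hypothetical 4-in-1,000 versus 3-in-1,000 example mentioned earlier -- a relative risk is simply the ratio of the event rates in the two groups:

```python
def relative_risk(events_exposed, n_exposed, events_control, n_control):
    """Risk ratio: event rate in the exposed group over the control rate."""
    return (events_exposed / n_exposed) / (events_control / n_control)

rr = relative_risk(4, 1000, 3, 1000)
print(round(rr, 2))  # 1.33 -- well short of the manual's 2.0 threshold
```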
Improbable events do happen at random -- as mathematician John Brignell notes rather tartly:
"Consider, instead, my experiences in the village pub swindle. It is based on the weekly bonus ball in the National Lottery. It so happens that my birth date is 13, so that is the number I always choose. With a few occasional absences abroad I have paid my pound every week for a year and a half, but have never won. Some of my neighbours win frequently; one in three consecutive weeks. Furthermore, I always put in a pound for my wife for her birth date, which is 11. She has never won either. The probability of neither of these numbers coming up in that period is less than 5%, which for an epidemiologist is significant enough to publish a paper."
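Brignell's arithmetic checks out. A quick Python check, assuming the classic 49-ball draw with one bonus ball per week and about 78 weekly draws in "a year and a half" (the ball count and number of weeks are my assumptions, not stated in the quote):

```python
# In any one week, 47 of the 49 balls are neither 11 nor 13,
# so the chance of neither number being the bonus ball is 47/49.
weeks = 78
p_neither_one_week = 47 / 49
p_never = p_neither_one_week ** weeks
print(round(p_never, 3))  # about 0.039 -- under the 5% threshold, as Brignell says
```

A result that would count as "statistically significant" in an epidemiological paper, generated by nothing more than two unlucky lottery numbers.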
One of the great pleasures in life is the first mouthful of cold beer on a hot day -- and the food Puritans can stick that wherever they like