What Happens When a Country Takes its Livestock Off Antibiotics?

Antibiotic-resistant bacteria infect two million Americans every year, causing at least 23,000 deaths. Even more die from complications related to the infections, and the numbers are steadily growing.

It’s now clear that we are facing a perfect storm that could take us back to the pre-antibiotic age, in which some of the most important advances in modern medicine – intensive care, organ transplants, care for premature babies, surgeries and even treatment for many common bacterial infections – would no longer be possible.

Experts have been warning about the implications of antibiotic resistance for years, but it’s time to face the facts. Many strains of bacteria are becoming resistant to even our strongest antibiotics and are causing deadly infections.

First, the bacteria are capable of evolving much faster than we are. Second, drug companies have all but abandoned the development of new antibiotics because of their poor profit margins.

Antibiotic Resistance: How Did This Happen?

Antibiotic overuse and inappropriate use – such as taking antibiotics to treat viral infections – bear heavy responsibility for creating the antibiotic-resistant superbug crisis we are facing today.

According to Dr. Arjun Srinivasan, associate director of the US Centers for Disease Control and Prevention (CDC), as much as half of all antibiotics used in clinics and hospitals “are either unneeded or patients are getting the wrong drugs to treat their infections.”

There’s more to the story than this, however, as antibiotic overuse occurs not just in medicine, but also in food production. In fact, agricultural usage accounts for about 80 percent of all antibiotic use in the US, making it a MAJOR source of human exposure to antibiotics.

Nearly 25 million pounds of antibiotics are administered to livestock in the US every year for purposes other than treating disease, such as making the animals grow bigger faster.

In other parts of the world, such as the EU, adding antibiotics to animal feed to accelerate growth has been banned for years. In the US, meanwhile, antibiotic residues in meat and dairy, as well as resistant bacteria, are passed on to you in the foods you eat.

Eighty different antibiotics are allowed in cows’ milk. According to the CDC, 22 percent of antibiotic-resistant illness in humans is in fact linked to food. In the words of Dr. Srinivasan:

“The more you use an antibiotic, the more you expose a bacteria to an antibiotic, the greater the likelihood that resistance to that antibiotic is going to develop. So the more antibiotics we put into people, we put into the environment, we put into livestock, the more opportunities we create for these bacteria to become resistant.”

This is a much bigger issue than antibiotics simply being left behind in your meat. For instance, bacteria often share genes that make them resistant. In other words, the drug-resistant bacteria that contaminate your meat may pass on their resistance genes to other bacteria in your body, making you more likely to become sick.

Drug-resistant bacteria also accumulate in manure that is spread on fields and enters waterways, allowing the drug-resistant bacteria to spread far and wide and ultimately back up the food chain to us. You can see how easily antibiotic resistance spreads, via the food you eat and community contact, in the CDC’s infographic below.

[CDC infographic: how antibiotic resistance spreads from livestock and food to the community]

One-Third of the Most Dangerous Resistant Pathogens Are Found in Your Food

According to the CDC’s report, there are 12 resistant pathogens that pose a “serious” threat to public health. One-third of them are found in food. The four drug-resistant pathogens in question are:

  • Campylobacter, which causes an estimated 310,000 infections and 28 deaths per year
  • Salmonella, responsible for another 100,000 infections and 38 deaths annually
  • E. coli
  • Shigella

Previous research suggested you have a 50/50 chance of bringing home meat tainted with drug-resistant bacteria from your local grocery store. But it may be even worse. Using data collected by the federal National Antimicrobial Resistance Monitoring System (NARMS), the Environmental Working Group (EWG) found antibiotic-resistant bacteria in 81 percent of ground turkey, 69 percent of pork chops, 55 percent of ground beef, and 39 percent of raw chicken parts purchased in stores in 2011. EWG nutritionist Dawn Undurraga, the report’s lead researcher, issued the following warning to the public:

“Consumers should be very concerned that antibiotic-resistant bacteria are now common in the meat aisles of most American supermarkets… These organisms can cause foodborne illnesses and other infections. Worse, they spread antibiotic-resistance, which threatens to bring on a post-antibiotic era where important medicines critical to treating people could become ineffective.”

What Happens When a Country Takes Its Livestock Off Antibiotics?

In the US, concentrated animal feeding operations (CAFOs) are hotbeds for breeding antibiotic-resistant bacteria because of the continuous feeding of low doses of antibiotics to the animals, which become living bioreactors in which pathogens survive, adapt, and eventually thrive. The European Centre for Disease Prevention and Control (ECDC) has ruled that antibiotic resistance is a major threat to public health worldwide, and that the primary cause of this man-made epidemic is the widespread misuse of antibiotics.

Measures to curb the rampant overuse of agricultural antibiotics could have a major impact in the US, as evidenced by actions taken in other countries. For example, Denmark stopped the widespread use of antibiotics in its pork industry 14 years ago. The European Union has also banned the routine use of antibiotics in animal feed over concerns about antibiotic-resistant bacteria.

After Denmark implemented the antibiotic ban, it was later confirmed that the country had drastically reduced antibiotic-resistant bacteria in its animals and food. Furthermore, the Danish ‘experiment’ proved that removing antibiotics doesn’t have to hurt the industry’s bottom line. In the first 12 years of the ban, the Danish pork industry grew by 43 percent, making it one of the top exporters of pork in the world. As reported by Consumer Reports:

“What happens when a country takes its livestock off antibiotics? In 2000 Denmark’s pork industry ceased using antibiotics to promote the growth of its animals. Instead of eviscerating the nation’s pork industry, those moves contributed to a 50 percent rise in pork production, according to a 2012 article in the journal Nature.

Frank Aarestrup, D.V.M., Ph.D., head of the EU Reference Laboratory for Antimicrobial Resistance and author of the article, attributes Denmark’s success to three factors: laws banning the improper use of antibiotics, a robust system of surveillance and enforcement, and rules that prevent veterinarians from profiting from selling antibiotics to farmers. ‘Farmers and their livestock can thrive without the heavy use of antibiotics,’ Aarestrup wrote. ‘With a little effort, I believe that other countries can and must help their farmers to do the same.’”

What’s Standing in the Way of Curbing Antibiotic Use in the US?

In a word, industry. For instance, the American pork industry doesn’t want to curb antibiotic use, as this would mean raising the cost of producing pork by an estimated $5 for every 100 pounds brought to market. The pharmaceutical industry is obviously against it as well. Even though drug companies aren’t keen on producing new antibiotics to bring to market, they want to protect those that are already here – especially the incredibly lucrative varieties that are used perpetually in animal feed. Even Dr. Aarestrup, who helped Denmark cut the use of antibiotics in livestock by 60 percent, wrote about the intense industry pressures he faced:

“Reducing Denmark’s reliance on antibiotics was far from easy. My lab was visited by pharmaceutical executives who did not like what we were finding, and I would be cornered at meetings by people who disagreed with our conclusions. I have even been publicly accused of being paid to produce biased results. Despite such challenges, it has been satisfying to see that Danish farmers and their livestock can thrive without the heavy use of antibiotics. …The practice continues unabated in the United States, despite a statement from the Food and Drug Administration [FDA]… suggesting that farmers should stop voluntarily.”

FDA Again Fails to Take Appropriate Action on Agricultural Antibiotics

The FDA issued its long-awaited guidance on agricultural antibiotics on December 11, 2013. Unfortunately, it’s unlikely to have a major impact in terms of protecting your health. The agency is simply asking drug companies to voluntarily restrict the use of antibiotics that are important in human medicine by excluding growth promotion in animals as a listed use on the drug label. This would prevent farmers from legally using antibiotics such as tetracyclines, penicillins, and azithromycin for growth promotion purposes. But it certainly does not go far enough to protect public health. The guidance contains far too many loopholes for any meaningful protection.

For example, farmers would still be allowed to use antibiotics for therapeutic purposes, which would allow them to continue feeding their animals antibiotics for growth promotion without actually admitting that’s the reason for doing so. As reported by Scientific American:

“[T]he success of the FDA’s new program depends on how many companies volunteer to change their labels over the next 90 days in alignment with the FDA cutoff period. (Companies that do change their labels will have three years to phase in the changes.) And then there are myriad questions about how this would be enforced on the farm.”

In short, while giving the superficial appearance of taking warranted action to protect public health, the agency is in reality simply shilling for the industry. Michael Taylor, FDA Deputy Commissioner for Foods and Veterinary Medicine and former VP for public policy at Monsanto, is again responsible for caving in to industry at the expense of human lives.

Why Did FDA Ignore Risk Factors from the Very Beginning?

According to a recent report from the Natural Resources Defense Council (NRDC), the FDA has known for more than a dozen years that using antibiotics in factory farms is harmful to human health, yet it took no action to curb the practice. And now, all it is doing is asking drug companies, which make massive amounts of money from these products, to voluntarily restrict their use.

The report also found that 26 of the 30 drugs reviewed by the FDA did not meet safety guidelines issued in 1973, and NONE of the 30 drugs would meet today’s safety guidelines… As reported by Rodale Magazine, the FDA is supposed to look at three factors when determining the safety of an antibiotic-based feed additive. Based on the three factors listed below, the NRDC’s report concluded that virtually ALL feed additives containing penicillin and tetracycline antibiotics—both of which are used to treat human disease—pose a “high risk” to human health, and should not be permitted:

  1. The chances that antibiotic-resistant bacteria are being introduced into the food supply
  2. The likelihood that people would get exposed to those bacteria
  3. The consequences of what happens when people are exposed to those bacteria—would they still be able to get treated with human antibiotics?

Looking on the Brighter Side

The impending superbug crisis has a three-prong solution:

  1. Better infection prevention, with a focus on strengthening your immune system naturally
  2. More responsible use of antibiotics for people and animals, with a return to biodynamic farming and a complete overhaul of our food system
  3. Innovative new approaches to the treatment of infections from all branches of science, natural as well as allopathic

There are some promising new avenues of study that may result in fresh ways to fight superbugs. For example, Dutch scientists have discovered a way to deactivate antibiotics with a blast of ultraviolet light before bacteria have a chance to adapt, and before the antibiotics can damage your good bacteria.

And British scientists have discovered how bacteria talk to each other through “quorum sensing” and are investigating ways of disrupting this process in order to render them incapable of causing an infection. They believe this may lead to a new line of anti-infectives that do not kill bacteria, but instead block their ability to cause disease. But the basic strategy you have at your disposal right now is prevention, prevention, prevention—it’s much easier to prevent an infection than to halt one already in progress.

Natural compounds with antimicrobial activity such as garlic, cinnamon, oregano extract, colloidal silver, Manuka honey, probiotics and fermented foods, echinacea, sunlight and vitamin D are all excellent options to try before resorting to drugs. Best of all, research has shown that bacteria do not tend to develop resistance to these types of treatments. The basic key to keeping your immune system healthy is making good lifestyle choices such as proper diet, stress management and exercise.

You Can Take Action to Help Save Antibiotics from Extinction

Avoiding antibiotic resistance is but one of several good reasons to avoid meats and animal products from animals raised in concentrated animal feeding operations (CAFOs). This is in part why grass-fed, pastured meat is the only type of meat I recommend. If you’re regularly eating meat bought at your local grocery store, know that you’re in all likelihood getting exposed to antibiotic-resistant bacteria and a low dose of antibiotics with every meal… and this low-dose exposure is what’s allowing bacteria to adapt and develop such strong resistance.

The FDA’s stance toward antibiotics in livestock feed is unconscionable in light of the harm it wreaks, and its weakness makes being proactive on a personal level all the more important. Quite simply, the FDA has been, and still is, supporting the profitability of large-scale factory farming at the expense of public health.

You can help yourself and your community by using antibiotics only when absolutely necessary and by purchasing organic, antibiotic-free meats and other foods from local farmers – not CAFOs. Even though the problem of antibiotic resistance needs to be stemmed through public policy on a nationwide level, the more people who get involved on a personal level to stop unnecessary antibiotic use the better. You can help on a larger scale, too, by telling the FDA we need a mandatory ban on sub-therapeutic doses of antibiotics for livestock—not weak, voluntary guidance.

FDA Deputy Commissioner and ex-Monsanto attorney Michael Taylor will leave quite a legacy behind. He’s not only served Monsanto and the other pesticide producers quite well, he seems to carry the same sentiment over to the antibiotic crisis. The FDA claims that a voluntary guideline “is the most efficient and effective way to change the use of these products in animal agriculture.” It would appear that Taylor’s concern for human health takes a very distant back seat to industry profits…

To make your voice heard, please sign the Organic Consumers Association’s petition calling for a mandatory ban on sub-therapeutic doses of antibiotics for livestock.

Source : Wakingtimes

More Evidence of Ginger’s Medicinal Potency

Gingerroot has been used for centuries, both as a spice in food preparation and for its medicinal properties. This herbal product is touted for its many positive effects upon human health, including aiding digestion, improving immunity, and lowering inflammation.

Recently, however, more evidence has surfaced indicating that ginger may play a role in managing type 2 diabetes, a disease whose prevalence has rapidly increased over the last decade alongside the changes in diet and lifestyle attributed to Westernized countries.

New research out of Iran shows that the use of ginger can actually lower blood sugar levels in people suffering from type 2 diabetes.

The report is based on data from 88 people who were diagnosed with type 2 diabetes and had been living with the disease for 10 years. This group was randomly given either three grams of ginger powder in capsule form daily or a placebo capsule. In just eight weeks, those who had taken the ginger experienced a decrease in fasting blood sugar from 171 to 150 mg/dl. In comparison, the group who had received the placebo treatment experienced a generalized increase in their fasting blood sugar levels.
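
For a sense of the effect size, here is a quick calculation using only the figures reported above (a minimal sketch in Python; the percentages are my own arithmetic, not from the study):

```python
# Effect size implied by the ginger study figures quoted above.
baseline = 171.0  # fasting blood sugar before treatment (mg/dl)
after = 150.0     # fasting blood sugar after 8 weeks of ginger (mg/dl)

absolute_drop = baseline - after
relative_drop = absolute_drop / baseline

print(f"absolute drop: {absolute_drop:.0f} mg/dl")  # 21 mg/dl
print(f"relative drop: {relative_drop:.1%}")        # about 12.3%
```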

Ginger can decrease blood sugar via various proposed mechanisms, including possible effects upon liver enzymes that convert stored glucose, in the form of glycogen, into free glucose that can enter the bloodstream. Ginger may also improve blood sugar through its direct effects upon the insulin receptors on our cells. Finally, ginger is a powerful anti-inflammatory agent, and inflammation adversely affects insulin sensitivity.

The intake of ginger may also affect the rate at which glucose is absorbed from our food. Certain spices like ginger have been shown to slow the absorption of glucose from the foods we consume, which in turn lowers blood levels of insulin and improves blood sugar control.

If you have type 2 diabetes and want better control over your blood sugar even while taking medications, you may want to consider ginger. Three to four grams of fresh ginger daily may be all it takes to experience the full therapeutic effects. Consult with your professional healthcare provider first, though, as ginger can act as a natural blood thinner and may interact negatively with certain medications.

Source : Naturalblaze

Surge of Attention to Crusading Economist: Piketty’s New Book Takes Us to the Big Question of How to Reduce Inequality

Thomas Piketty’s new book, “Capital in the Twenty-First Century,” has done a remarkable job of focusing public attention on the growth of inequality in the last three decades and the risk that it will grow further in the decades ahead. Piketty’s basic point on this issue is almost too simple for economists to understand: if the rate of return on wealth (r) is greater than the rate of growth of the economy (g), then wealth is likely to become ever more concentrated.
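
To see the arithmetic behind r > g, here is a minimal sketch (in Python, using illustrative numbers of my own choosing, not Piketty’s estimates) of how a capital holder’s share of the economy grows when returns outpace growth:

```python
# Toy illustration of the r > g dynamic.
# All numbers are illustrative assumptions, not Piketty's estimates.
r = 0.05        # annual rate of return on wealth
g = 0.02        # annual growth rate of the overall economy

wealth = 1.0    # a capital holder's wealth (arbitrary units)
economy = 10.0  # total size of the economy (same units)

for year in range(0, 61, 20):
    print(f"year {year:2d}: wealth share = {wealth / economy:.1%}")
    wealth *= (1 + r) ** 20    # compound the wealth forward 20 years
    economy *= (1 + g) ** 20   # compound the economy forward 20 years
```

With these assumed rates, the share climbs from 10 percent to about 18, 32, and then 57 percent over 60 years, since it grows by a factor of ((1 + r)/(1 + g)) each year. The toy model ignores consumption and taxation, and obviously breaks down as the share approaches 100 percent, but it captures the compounding logic of Piketty’s claim.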

This raises the obvious question: what can be done to offset this tendency toward rising inequality? Piketty’s answer is that we need a global wealth tax (GWT) to redistribute from the rich to everyone else. That is a reasonable solution if we’re just working out the arithmetic in this story, but don’t expect many politicians to be running on the GWT platform any time soon.

If we want to counter the rise in inequality that we have seen in recent decades we are going to have to find other mechanisms for reversing this upward redistribution. Specifically, we will have to look to ways to reduce the rents earned by the wealthy. These rents stem from government interventions in the economy that have the effect of redistributing income upward. In Piketty’s terminology cutting back these rents means reducing r, the rate of return on wealth.

Fortunately, we have a full bag of policy tools to accomplish precisely this task. The best place to start is the financial industry, primarily since this sector is so obviously a ward of the state and in many ways a drain on the productive economy.

A new IMF analysis found that the value of the implicit government insurance provided to too-big-to-fail banks was $50 billion a year in the United States and $300 billion a year in the euro zone. The euro zone figure is more than 20 percent of after-tax corporate profits in the area. Much of this subsidy ends up as corporate profits or income to top banking executives.

In addition to this subsidy, we also have the fact that finance is hugely under-taxed, a view shared by the IMF. It recommends a modest value-added tax of 0.2 percent of GDP (about $35 billion a year). We could also impose a more robust financial transactions tax like the one Japan had in place in its boom years, which raised more than 1.0 percent of GDP ($170 billion a year).

In this vein, serious progressives should be trying to stop plans to privatize Fannie and Freddie and replace them with a government subsidized private system. Undoubtedly we will see many Washington types praising Piketty as they watch Congress pass this giant new handout to the one percent.

The pharmaceutical industry also benefits from enormous rents through government granted patent monopolies. We spend more than $380 billion (2.2 percent of GDP) a year on drugs. We would spend 10 to 20 percent of this amount in a free market. We would not only have cheaper drugs, but likely better medicine if we funded research upfront instead of through patent monopolies since it would eliminate incentives to lie about research findings and conceal them from other researchers.

There are also substantial rents resulting from monopoly power in major sectors like telecommunications and air travel. We also give away public resources in areas like broadcast frequencies and airport landing slots. And we don’t charge the fossil fuel industry for destroying the environment. A carbon tax that roughly compensated for the damages could raise between $80 billion and $170 billion a year (0.5 to 1.0 percent of GDP).
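
As a rough consistency check on the percent-of-GDP figures quoted in this piece, here is a minimal sketch assuming US GDP of roughly $17 trillion, its approximate level when these numbers were published (the GDP figure is my assumption, not the author’s):

```python
# Converting the percent-of-GDP figures above into dollars.
# Assumes US GDP of about $17 trillion (my assumption, roughly the 2014 level).
US_GDP = 17e12  # dollars

figures = {
    "modest financial tax (0.2% of GDP)": 0.002,
    "robust transactions tax (1.0% of GDP)": 0.010,
    "annual drug spending (2.2% of GDP)": 0.022,
    "carbon tax, low end (0.5% of GDP)": 0.005,
    "carbon tax, high end (1.0% of GDP)": 0.010,
}

for label, share in figures.items():
    print(f"{label}: about ${share * US_GDP / 1e9:.0f} billion a year")
```

The outputs (about $34, $170, $374, $85, and $170 billion a year) line up with the article’s rounded figures.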

This short list gives us plenty of places where we could pursue policies that would lower profits to the benefit of the vast majority of the population. And, these are all ways in which a lower return to capital should be associated with increased economic efficiency. This means that, unlike pure redistributionist measures like taxing the rich, we would have a larger pie that would even allow for some buying off of the losers.

These are the sorts of measures that economists usually seek out when the pain is inflicted on ordinary workers. Economists are big fans of trade agreements that arguably boost growth but lead to a loss of jobs and wages for manufacturing workers. For some reason economists don’t have the same interest in economic efficiency when the losers are the rich, but that is no reason the rest of us should not use good economic reasoning in designing an agenda.

In addition to the rent reducing measures listed above, there are redistributionist measures that we should support, such as higher minimum wages, mandated sick days and family leave, and more balanced labor laws that again allow workers the right to organize. Such measures should help to raise wages at the expense of a lower rate of return to wealth.

If this post-Piketty agenda sounds a great deal like the pre-Piketty agenda, it’s because the book probably did not change the way most progressives think about the world. The basic story is that income and wealth are being redistributed upward.

Piketty has produced an enormous amount of data to support what we already pretty much knew. This is very helpful. However, the real question is how we are going to reverse this upward redistribution. For better or worse, Piketty pretty much leaves us back with our usual bag of tricks. We just might feel a greater urgency to use them.

Source : Alternet

The Dark Truth Behind The Popular Superfood, Quinoa

Quinoa is rising up the popularity charts as a food staple in the U.S. and Europe. A growing spate of positive coverage cites quinoa (pronounced KEEN-wa), a high-protein, grain-like relative of spinach and beets, as a newly discovered gluten-free superfood. Its popularity has also spawned controversy, following reports that high global quinoa prices have put the crop out of reach for the people who grow it.

Many Americans want to get down to the bottom line: Should I eat it or not? Tanya Kerssen, a Bolivia-based researcher for Food First who studies quinoa, thinks that is the wrong question.

“The debate has largely been reduced to the invisible hand of the marketplace, in which the only options for shaping our global food system are driven by (affluent) consumers either buying more or buying less,” she writes. “…whichever way you press the lever (buy more/buy less) there are bound to be negative consequences, particularly for poor farmers in the Global South.”

So what should you know about quinoa and its complex story?

Let’s begin by looking at the Bolivian Altiplano, the high flat plain in the Andes where quinoa originates, from the perspective you might have if you were to visit. Two to three miles above sea level, the Quechua (modern-day Inca) and Aymara (a people who pre-date the Inca) still live in the same place where they first domesticated quinoa, as well as potatoes and many indigenous crops you’ve probably never heard of: oca, arracacha, kañawa, isaño, papaliza, and more.

In the north, around Lake Titicaca, you’ll encounter warm days and cold nights, at altitudes of 10,000 feet and up. Here you’ll observe a wide variety of crops and livestock: potatoes, barley, lima beans, sheep, pigs, dairy cows, and even guinea pigs (yes, raised for food). You might see some quinoa, but you’ll be hard-pressed to find a single llama, even though they were domesticated in this region.

Travel south and the temperature gets colder and the climate drier. Cows and sheep give way to llamas and alpacas. The land is covered in shrubs and grasses, although if you head over near the mountain Sajama you can see the world’s highest forest. Even so, the trees are tiny and stunted compared to what you might think of as a tree. Most crops would be unable to grow here, but the llamas and alpacas happily survive off the native vegetation, as do their wild cousins, vicuñas. As a tourist, you might choose to visit the Salar de Uyuni, the world’s largest salt flats, which form an amazing landscape—literally, a sea of salt.

It is here in the Southern Altiplano, near the salt flats, where one finds quinoa growing for export.

In areas where Bolivians can grow more diverse crops and raise more profitable livestock, they do. But here, very little grows. According to Kerssen, “Quinoa was particularly well-suited to areas with ‘high climatic risk’ such as the southern Altiplano—able to withstand levels of drought, salinity, wind, hail, and frost in which other crops would perish.” The region is characterized by very little rainfall, more than 200 frost days per year, and poor soils.

The history of this part of the country, and its people, has been virtually dictated by its natural resources and climate. In pre-Columbian times, Andean peoples obtained balanced diets by trading extensively with their neighbors at other altitudes, often based on kinship ties.

This was disrupted when the Spanish found silver nearby in 1545. In the following centuries, an enormous percentage of the local population was conscripted into slave labor in the mines, and many never returned. The Spanish also set up haciendas in much of the country, in which the indigenous farmed to produce food and wealth for white landowners. The haciendas continued long after Bolivia’s independence, until the Bolivian revolution in 1952.

Between the mining and the haciendas, traditional kinship ties and community organization were radically disrupted in much of the country. But the harsh climate and poor conditions for growing European crops kept the modern quinoa-growing region largely outside the hacienda system.

Through this time, land was managed communally. The flat grasslands were used as a grazing area for llamas and alpacas, and quinoa planting took place on hillside terraces, which Kerssen explains were “allocated by traditional authorities based on a family’s needs.” Remember, before the age of tractors, a family’s supply of labor corresponded to the number of mouths it had to feed. This traditional management system ensured that each field would be left fallow for many years following a quinoa crop to allow the nutrient-poor soil to recover fertility and to prevent pests and diseases.

For centuries and until recently, the indigenous people of Bolivia were typically either ignored by the outside world or oppressed and exploited by it. Even after the white minority in Bolivia established its own government independent of Spain, the indigenous remained an underclass in their own country. Healthy indigenous foods like llama and quinoa were looked down upon as “dirty” Indian food, and the indigenous and their traditional ways were seen as a roadblock standing in the way of national development.

This is where our modern quinoa story starts.

The first change started in the 1970s. This was at the end of an era when U.S. Cold War policy hoped to stave off communism by introducing hybrid seeds, pesticides, fertilizers, irrigation, and tractors in the Global South.

The initial plan for Bolivia, drawn up by an American in the late 1940s, did not have high hopes for the poor peasants of the southern Altiplano and the harsh environment they lived in. After the Bolivian Revolution in 1952, the new government successfully convinced the U.S. that it was all that stood between Bolivia and communism. The U.S. supported the government with excessive amounts of aid; in some years U.S. aid accounted for a quarter of Bolivia’s national budget.

While the southern Altiplano was hardly the focus of this aid, they were not entirely cut off from it either. By the 1970s, the first tractors reached the quinoa-growing region. “The introduction of the tractor is the major game-changer for quinoa and for the transformation of the environment,” notes Kerssen.

This is for two reasons. First, the tractors cannot operate on terraced hillsides where quinoa was traditionally grown, so quinoa production moved to the flat pampas where llama herds traditionally grazed instead. Second, while they are much more efficient than farming with hand tools or plowing with draft animals, tractors are worse for soil fertility than their “primitive” alternatives.

That isn’t to say nobody should ever use a tractor, but it’s a factor to consider, particularly when farming on the “fragile, sandy and volcanic soils of the southern Altiplano, which are characterized by high salinity, a scarcity of organic matter, and low moisture retention capacity,” as Kerssen describes them. Additionally, the soils of the hillside terraces, where quinoa was previously grown, contain more clay, nutrients, and organic matter than the pampas. So tractors not only negatively impact soil fertility, they also necessitated moving from areas of better soil to areas of worse soil.

Meanwhile, during this time, U.S. food aid imports were changing the national diet from traditional Andean foods to cheap, processed wheat products from the U.S. Even today, anyone visiting Bolivia will see vendors selling enormous bags of small white bread rolls on the streets. You’ll be lucky if you find anyone selling pan integral (whole wheat bread), as it’s not the norm.

Back then, when the first tractors appeared, Bolivia was still decades away from the quinoa boom. In the 1980s, Bolivians suffered when, under U.S. influence, their government imposed severe economic austerity. With few prospects to make a living in the economically depressed southern Altiplano, many left. Those who remained had a hard time keeping up labor-intensive farming activities, like animal husbandry. And, of course, with less labor around, tractors became more necessary for those who could access them.

Quinoa export to the U.S. began in 1984. At first, it was not easy. Processing quinoa was done manually, and the end product might taste bitter if its mildly toxic coating of saponins was not sufficiently removed. Back then, you might even find a small rock in your quinoa, which would have been threshed and winnowed by hand.

With little external support, a cooperative formed by quinoa-producing communities set out to find a better way. They traveled to Peru and Brazil to learn about processing machinery for other commodities, and attempted to build their own quinoa equipment based on a barley hulling machine. In the 1990s, the outside world stepped in. The United Nations financed construction of processing plants, and in 2005, the U.S. and Denmark helped develop new technologies to improve efficiency and quality.

Health-conscious readers in the U.S. already know the end of the story. Quinoa took off. After hovering around $500 per metric ton for decades, the prices paid to Bolivian farmers skyrocketed to nearly $800 in 2008 and over $1,300 in 2010.

With higher prices, families sold off llama herds to grow quinoa on former grazing land and to invest in tractors. The symbiosis of quinoa and llamas, taking and restoring fertility from the soil in turn, was broken. Instead of leaving a field fallow for many years after harvesting a crop, now Kerssen meets many farmers who grow quinoa on their land every other year, or even every year, without allowing the fragile land time to recover. And manure, once abundant from llamas, is now in short supply.

Kerssen has seen this first-hand. While she has seen studies claiming that quinoa can be grown sustainably with short fallow periods if the farmers restore the soil with lots of organic matter (i.e. manure), “then you see a lot of quinoa production… where the plants are very small, very stunted, not very much grain, the soil looks like sand, where it’s just clear that very little organic matter has been introduced to the soil,” she says.

“As the animal herds are reduced, the price of animal manure has gone up through the roof,” she continues. “You used to have very easy access to it in your community because almost every community had large herds. That [llama herding] was primarily what they did. But now manure has become this boom commodity and the de facto result of that is that it’s the more well-capitalized farmers with more money who can access the manure, and not the poorer farmers. So I think a lot of evidence points to a process of greater inequality.”

Another consequence is social, as those who have moved to the cities come back to cash in by growing quinoa on their family’s lands. These are folks who are no longer used to abiding by the rules of the rural communities and who continue to live in the cities, visiting their fields a few times a year to plant and harvest. In other words, they do not have the same stake in community that full time residents have.

“When you go through the southern Altiplano,” Kerssen recalls, “It still looks totally abandoned. It looks like it’s been bombed out. There are no kids in the schools and the homes are in very, very poor condition for the most part.” Locals, understandably, have little tolerance for urbanites and outsiders hoping to come to the area to make a quick buck in quinoa, and then take their money and leave.

“One thing communities have started saying,” Kerssen explains, is “if you’re going to be here and if you’re going to grow quinoa, then everyone is required to make certain kinds of investments. It’s decided in these community meetings, everything is by consensus.” Perhaps a community will decide that quinoa growers must invest their profits in building a decent bathroom for each family or into a local school. “It’s different in every community, but they’re making these self-regulations to ensure that the money does not totally leave the community.”

The last issue that is often raised is nutrition. Quinoa became popular because of its health value, yet the quinoa-growing region is the most malnourished in Bolivia because farmers cannot afford to eat their own crops. They sell their high-value quinoa and buy cheaper, less nourishing foods instead. Kerssen sees this as a problem with a history going back as far as the Spanish conquest, when the traditional system of trade among peoples of various altitudes and ecosystems was interrupted.

“I think a lot of pretty simplistic statements have been made which have really scandalized people about how the quinoa boom is pulling quinoa out of the reach of the producers who can no longer afford to eat it, and the situation is much more complex than that,” she says. “The reality is that Oruro and Potosi [the areas where quinoa is grown for export] have the highest rates of infant malnutrition probably in South America. Now is that caused by quinoa or caused by the quinoa boom?”

“No, obviously not,” she answers, “Because the causes go very deep going back to the Spanish conquest and the isolation and marginalization of people who used to, through their social systems, have access to all kinds of foods—fruits, vegetables, fish from Lake Titicaca….” But this is a region that will never have a healthy, diversified diet if its people are limited to locally produced food.

Still, she sees truth in the notion that quinoa producers sell off their crops to buy cheaper foods like rice and pasta. “People say it’s more worthwhile for me, for my food security, to sell quinoa and to buy things that are cheaper like wheat and rice, that are less healthy but that fill you up… There’s no doubt in my mind that that is happening to some degree, but I think it’s important to emphasize that malnutrition and hunger and poverty go way back before quinoa was widely produced.”

“I think that there’s a development question that’s at the core of all development everywhere in the world that is no different here, in Africa, or in the US, which is does having a higher income necessarily lead to better health and a better quality of life? And we know that it doesn’t, or it doesn’t necessarily.”

Despite the controversies and the problems, Kerssen sees the quinoa boom as a victory for Andean peasants. “The peasants have been fighting for a market during the most brutal period of neoliberalism. But what’s also clear is that this has gotten away from them, and some things have happened now that they didn’t expect. Now they are dealing with the consequences of it.”

She feels troubled that American accounts of the story “either fall on the side of ‘the quinoa boom is amazing and it’s lifting people out of poverty’ or ‘the quinoa boom is terrible and is destroying people’s lives,’ and in both of those narratives the indigenous people are given no agency… If we know about quinoa at all in the north, it’s because of peasants really fighting anti-peasant policies during the most anti-peasant period… these people being like what can we do to survive on the land with our culture doing something that is culturally appropriate.”

Anyone who has visited Bolivia and studied its history knows not to discount the power of the local people, who built one of the most impressive civilizations in the New World and survived centuries of exploitation, keeping their cultures and ways of life largely intact, and ultimately ousting an unpopular, U.S.-backed president in 2003, leading to the election of their first indigenous president ever, Evo Morales.

Today, Kerssen sees many communities “especially in the traditional quinoa growing zone really taking seriously the issue of soil erosion, the issue of social conflicts, due not entirely to the quinoa boom but certainly exacerbated by it.” And, it won’t be surprising at all if they partner with local scientists and NGOs to overcome their problems and continue selling quinoa to the world.

So, given all this, back to the original question: should you buy quinoa or not? Kerssen thinks this question misses the point, reinforcing the idea “that we just need to blindly depend on marketing forces when really the struggle for food sovereignty and the right of farmers goes so far beyond that. It goes to the regulation of trade and the regulation of the food supply and education in Bolivia, around the native crops.”

She concludes, saying, “The fact of the matter is that pretty much everywhere in the world, food that’s produced by peasants, especially native foods, have never gotten any support over, and… one of these foods has now become globally profitable.”

Source : Alternet

Chelsea Clinton’s Pregnancy Gives Birth to New Right-Wing Conspiracies

Exciting news from the American political scene: birtherism is back! No, not birtherism as in the nonsensical conspiracy theory about Barack Obama actually being born in Kenya, once so popular among political geniuses including Donald Trump. Please – that kind of birtherism is so 2008 (and 2009. And 2010. And 2011. And 2012 …) I’m talking about the all new, all shiny 2014 birtherism: Clinton birtherism!

Chelsea Clinton, as you may have heard, is expecting a baby with her husband, Marc Mezvinsky. What’s that you say? “Awwwww”? “Bless the happy couple”? Well, I’m afraid you’re just not trying hard enough because, over in the US, the noise that greeted this happy announcement was the sound of a million axes being ground.

First came the inevitable weirdy-weirdos who are so incapable of thinking about anything other than women’s reproductive organs for more than two seconds that they managed to turn Clinton’s birth announcement into a debate about abortion: “It’s no secret that the Clintons support abortion,” intoned an editorial in the Christian Post sorrowfully. “But when it’s their own grandchild, it appears the Clintons see things differently. No talk of a non-person foetus, only of a child.” Yes, the sheer hypocrisy! Imagine, wanting to give women access to abortion, but not actually wanting to abort every single pregnancy? Seriously, my brain is bursting trying to compute this nonsensical contradiction.

Next came the new birthers who saw not the happy miracle of life but mere campaigning. Newsmax’s Steve Malzberg kicked off proceedings nicely with the statement: “When I say [the pregnancy is] staged I have to believe she’s pregnant, if she says she’s pregnant.” That’s generous of you, Steve! Pray, continue: “But what great timing! God answered Hillary Clinton’s prayers and she’s going to have the prop of being a new grandma while she runs for president.”

Yes, the prop of one’s 34-year-old married daughter having a baby! Whoever heard of such a crazy scenario? Those scheming Clintons really will do anything to get ahead. Bill may well, in his day, have played a role in various women’s sex lives, but who knew he could command his own daughter to conceive? Others echoed Malzberg’s sentiment, with much talk of Hillary Clinton deliberately “softening her image” and “adding compassion” to her persona via the medium of her unborn grandchild. People who actually seem to believe this span the political spectrum, ranging from conservative writer Michael Goldfarb, who tweeted: “Just in time for HRC [Hillary Rodham Clinton] to have a baby on stage with her when she announces she’s running, right?”, to New York Times columnist Andrew Ross Sorkin, who said on MSNBC: “Can we talk about the human drama that is Grandma Clinton? I’m not suggesting that anyone’s having a baby for election purposes, but …” Gotta love that “but”! And what is this “human drama … Grandma Clinton”? Is it a new HBO series? Because I’ve been looking for a new one since Breaking Bad finished.

But by far the biggest debate is this: can Hillary Clinton run for president AND be a grandmother? Oh, women – always wanting to have it all, aren’t they? Well-known US TV journalists have expressed disbelief that the two roles could possibly be combined: “President or grandmother?” asked Charlie Rose, who somehow manages to combine being a prominent interviewer while also being an occasional ass. On NBC, David Gregory fretted over whether Clinton’s impending grandmotherhood will “factor into” her decision to run. “I don’t know, David. Did the birth of your three children make you worse at your job?” snapped Erin Gloria Ryan at Jezebel.com. Sadly, Gregory has yet to respond. “Perhaps it’s sexist even to ask the question,” began an editorial in the Christian Science Monitor, promisingly. “If we had to guess, we’d say that Hillary Clinton will be a tad less interested in running for president now that she’s about to be a grandmother.” Yes, you know what those old ladies are like: one hint of a grandchild and they’re too busy knitting booties to bother worrying about world affairs.

Who knew being a grandmother was such an all-encompassing job; one that prevented a woman from doing anything else? As it happens, the Sunday Times reminded readers of this fact when, in a front page headline, it described the distinguished Margaret Archer, head of the Pontifical Academy of Social Sciences, simply as “Grandmother, 71”. Whether stories about, say, David Attenborough will now be headlined in the Times “Grandfather, 88” has not yet been confirmed. But then, women must always be defined by their fertility, or even their children’s fertility, whereas men are, of course, defined by their jobs.

Just as old-school birtherism – “Barack Obama was actually born in Kenya!” – was really just code for “Holy hell, we can’t possibly have a black man as president!”, so the current birtherism – “Hillary can’t possibly be a grandmother and run for office!” – is really just code for “Holy hell, we can’t possibly have a woman as president, and an OLD one to boot!” Mitt Romney, to take one random and deeply obvious example, is the proud grandfather of 23 grandchildren, two of whom were born while he was on the campaign trail, and not once was this raised as a potential disadvantage. Many presidents have been grandfathers in office, although it’s actually quite hard to figure out how many because, guess what? No one ever gave a good God damn, so no one kept track. Funnily enough, Jeb Bush, who may well be Clinton’s opponent in 2016, is also a grandfather, but that has yet to be raised as an issue against him. Then again, he does come with other baggage. So Clinton can console herself with this thought: if the most that idiots can knock her for is being a grandmother, she probably has that election sewn up.

Source : Alternet

Airbnb’s Big Bait-and-Switch

Airbnb was in court in New York on Tuesday, battling with the state attorney general over the question of how much information about its “hosts” the company should be required to reveal to the state. The case has attracted much attention, because New York is simultaneously one of Airbnb’s biggest markets, and its most fiercely fought battleground with regulators. There’s plenty at stake — not least of which is Airbnb’s brand-new $10 billion valuation, following a $500 million round of investment that closed last week. In 2013, Airbnb pulled in $250 million in revenue from its thousands of hosts. That’s real money.

Real money, in the “sharing” economy.

Airbnb and its advocates argue, with some justice, that the current laws regulating short-term rentals are outmoded and don’t fit the new business models nurtured by cloud computing and smartphones. That’s undoubtedly true, though as the New York Times’ David Streitfeld wrote in a smart piece on Tuesday, the regulatory mess is at least partially the result of the fact that for “sharing economy” companies, “questions about safety, taxes and regulation have tended to be an afterthought.” Silicon Valley sees no problem with breaking the law first, and then lobbying to fix it later.

A “crowd-funded” ad organized by Peers.org, an outfit that advocates for sharing economy companies and was co-founded by an executive of Airbnb, declares that “we’re asking lawmakers for more sharing, not less. We want to play by the rules, but New York needs laws that are safe, fair, and clear. Support sharing. Fix the law.”

It would be nice if the Peers ad clarified that in the case of Airbnb, “sharing” actually means “short-term rentals,” but that horse left the barn a long time ago. Whatever we call what happens when one person rents out living space to another, there is little doubt that there is significant political support, particularly among younger people, for a regulatory structure that is more friendly to how Airbnb conducts business. In Silver Lake, a Los Angeles neighborhood that has been witnessing its own Airbnb regulatory showdown, a slate of supporters of Airbnb rentals won election to a neighborhood council last week.

But everyone agitating for laws more friendly to Airbnb should take a closer look at some of the details of the New York legal struggle. In the New York Times, Streitfeld points out that when the state attorney general’s office investigated exactly who was posting listings on the Airbnb platform, it discovered a funny thing.

On Jan. 31, there were 19,522 listings for New York City properties on Airbnb from 15,677 hosts, according to data the attorney general submitted to the court. But nearly a third of the listings were from only 12 percent of the hosts.

One Airbnb landlord had 127 listings in Manhattan on a single weekend last fall. Sixteen other landlords had at least 15 listings each.

By no stretch of the imagination can this be properly considered the “sharing economy.” The data prove that nearly a third of Airbnb’s New York listings were generated by landlords who were cashing in on the ability to make a profit from offering short-term hotel rentals without having to pay the normal costs borne by the hotel sector — taxes, safety compliance and so on.
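
To put those figures in perspective, here is a quick back-of-the-envelope calculation in Python using only the numbers reported above (the rounding is mine):

```python
# Back-of-the-envelope look at the listing concentration reported above.
total_listings = 19522
total_hosts = 15677

big_hosts = round(0.12 * total_hosts)     # "12 percent of the hosts"
big_listings = round(total_listings / 3)  # "nearly a third of the listings"

print(f"~{big_hosts} hosts held ~{big_listings} listings "
      f"(~{big_listings / big_hosts:.1f} listings each)")

other_hosts = total_hosts - big_hosts
other_listings = total_listings - big_listings
print(f"the other {other_hosts} hosts held ~{other_listings} listings "
      f"(~{other_listings / other_hosts:.2f} listings each)")
```

Roughly 1,900 hosts averaging about 3.5 listings apiece, versus fewer than one listing per host for everyone else, is the asymmetry the attorney general seized on.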

Now, if I were one of the people mobilized by Peers.org to contact my congressional representative or state senator to lobby for laws more accommodating to Airbnb, I might look at those numbers and wonder what, exactly, I’ve been supporting.

Airbnb responded to the data with a purge. On Sunday, Airbnb’s director of public policy, David Hantman, noted in a blog post that the company was rushing to clean up its listings.

But when we examined our community in New York, we found that some property managers weren’t providing a quality, local experience to guests. These hosts weren’t making their neighborhood stronger and they weren’t delivering the kind of hospitality our guests expect and deserve. In some cases, they were making communities worse, not better. We took a hard look at our community in New York to identify these hosts and we took action.

Earlier this year, we began notifying these hosts that they and their more than 2,000 listings would be permanently removed from the Airbnb community. While we are allowing these hosts to support their existing bookings, all are now prohibited from accepting new reservations and if you search for a place to stay in New York, you won’t find these listings.

Fair enough. But let’s not be naive. Would Airbnb be engaging in a mass cleanup of its listings if it hadn’t been on the receiving end of close government scrutiny? Remember — New York is one of Airbnb’s largest markets and the company raked in revenue of $250 million in 2013. A significant chunk of that revenue must have come from absentee landlords maximizing their income from multiple properties. That’s real money, and Airbnb was happy to accept it … until it became politically unfeasible to do so.

Why is any of this important? Because it’s worth understanding that over the last couple of years, when Airbnb has rallied political support for its business model by declaring, flat out, that it is pioneering a better way for humans to relate to each other, that it has been building trust between strangers and helping people struggling in a tough economy earn some extra dollars, and that “sharing” a room is somehow morally better than paying for a hotel room, what the company has also been doing is making quite a bit of money off absentee landlords who have been exploiting its platform to offer hotel-like services without conforming to hotel regulations. And it seems pretty clear, in the New York case, that the company didn’t start seriously cleaning up its listings until real political pressure was brought to bear.

Airbnb provides a service that people clearly want. It is inevitable that the rules will be modified to provide room for its operations. Cloud computing and smartphones allow for much more efficient allocation and coordination of resources, and the existing hotel industry will be forced to adapt. That’s cool.

But we’ve also been told, repeatedly, that we should trust the sharing economy to regulate itself, because, somehow, it will be in the market’s best interest not to misbehave. But the record shows that the short-term rental economy will not regulate itself; that it will, instead, seek to make as much money as it can while the getting is good. Until someone pays attention.

Source : Alternet

Study Suggests Vegetarians Are Less Healthy, More Prone to Disease than Meat-Eaters

What you eat has both immediate and long-term effects on your health. Which foods produce the best health results has been a point of contention since as far back as we can remember. Generally, no one argues that a diet rich in vegetables is a bad diet, but a new study says that those completely abstaining from non-vegetarian foods (meat, dairy, etc.) – vegans and vegetarians – may be less healthy than their meat-eating counterparts.

The research, conducted at the Medical University of Graz and published in PLoS ONE, says that although vegetarians are more likely to make healthy lifestyle choices, they are also more likely to suffer from allergies, heart attacks, and cancer—a sharp difference from the picture of vegetarian health we most commonly see.

CBS Atlanta reports that not all the news was bad for vegetarians—they are more physically active, smoke less tobacco, and drink less alcohol. They also are more likely to have a higher socioeconomic status and a lower body mass index (BMI).

But the study says the benefits begin and end there.

Vegetarians in the Austrian Health Interview Survey:

  • Were twice as likely to suffer from allergies,
  • Had a 50 percent greater risk of heart attacks and cancer, and
  • Were “found to be in a poorer state of health compared to other dietary groups,” reporting higher incidence of chronic disease, impairment from disorders, and more anxiety and depression.

The study abstract concluded:

“Our study has shown that Austrian adults who consume a vegetarian diet are less healthy (in terms of cancer, allergies, and mental health disorders), have a lower quality of life, and also require more medical treatment,” wrote the study authors. “Therefore, a continued strong public health program for Austria is required in order to reduce the health risk due to nutritional factors.”

While my first question about the study was who was funding it, the researchers say they had no funding to report and that no “competing interests” existed.

In contrast, the Centers for Disease Control and Prevention (CDC) note that a diet rich in vegetables and fruits can reduce the risk of cancer and other chronic diseases. Numerous other studies have affirmed this stance, and the most recent research stands out as being abnormally contrary.

Source : Naturalsociety

Have a Cough? Pineapple Juice Found to be 5x more Effective than Cough Syrup

As a reader of NaturalSociety, you’re likely someone always looking for a natural alternative for over-the-counter drugs and treatments. When it comes to an upset stomach, you may try ginger or peppermint tea instead of the “pink stuff”. If you have a headache, you can reach for lavender essential oil or try some acupressure. And now, when you have a cough, you can add something new to your arsenal—pineapple juice.

Pineapples contain copious amounts of vitamin C, manganese, and something known as bromelain, an enzyme that fights inflammation. Said to prevent unhealthy blood clots and improve digestion, bromelain works on a variety of fronts to encourage healing and discourage illness. It has antiviral, anti-inflammatory, and antibacterial properties. Together, these components can soothe a cough by calming the airways and fighting infection.

One study found that pineapple is actually more effective at quelling a cough than cough syrup—up to 5 times more effective.

The vitamin C in one cup of pineapple juice is half of the daily recommended amount. Vitamin C, as you may know, is crucial in building a strong immune system and fighting infection. The bromelain acts as an anti-inflammatory that can soothe irritated airways.

In a 2010 issue of Der Pharma Chemica, researchers reported that a mixture of raw pineapple juice, pepper, salt, and honey was able to dissolve mucus in the lungs of tuberculosis patients. Another study found that extracts from the fruit could cut mucus notably faster than over-the-counter syrups, reducing all related symptoms.

Pineapple Cough Suppressant Recipe

  • 1 cup of fresh pineapple juice
  • 1/4 cup of fresh lemon juice
  • 1 piece of ginger (about 3 inches)
  • 1 Tbsp raw honey
  • 1/2 tsp cayenne pepper

Additional benefits of pineapple juice include sore throat relief, arthritis and injury pain management, and a reported ability to slow blood clotting, which is said to make it a good option for many heart patients.

But going to the grocery store and grabbing the first carton of pineapple juice you see is not the best way to get these benefits. Not only can the heat used to process that juice destroy the bromelain, but prepared products are typically loaded with sugar and other unnecessary ingredients.

If you have a juicer, put it to work. You can also eat the fruit whole.

Source: NaturalSociety


Canada Delays Spring GMO Alfalfa Release

Resistance to sowing genetically modified crops is sweeping the globe. After the National Farmers Union of Canada raised valid concerns about cross-contamination between GMO and non-GM crops, a major reversal has followed from Ontario to Manitoba and Quebec to the Yukon: RoundUp Ready alfalfa will not be available to Canadian growers this spring.

Many in Canada have been concerned about the approval process for GM crops, and after activists and farmers ratcheted up the pressure, a commercial provider of GM alfalfa has held back its product. Forage Genetics International (FGI), led by Mike Peterson, confirmed that it would not be releasing GM alfalfa this planting season:


“For spring of 2014, Forage Genetics will not be commercializing Roundup Ready alfalfa in Canada anywhere. That’s the only decision we’ve made so far. We’re just going day to day on the decision making process, but we have made a decision about spring.”

This is a significant development, since the Canadian Food Inspection Agency (CFIA), roughly the Canadian counterpart of the US FDA, granted registration of the GM variety in 2013 and gave FGI exclusive rights (including seed patents) to begin commercializing the crop this year. Many in the biotech industry saw FGI’s planting rights as an open door for getting future GMO crops into Canada, so when FGI announced in The Western Producer in March that it would not make its seed available, many officials were surprised.

FGI and the biotech industry planned on using GM alfalfa as the ultimate weed-control tool. The alfalfa seed created by FGI was specifically engineered to resist RoundUp chemicals (primarily glyphosate), so Monsanto’s herbicide can be sprayed at liberty, choking back all sorts of flowers, weeds, and herbs that interfere with mono-cropping while the alfalfa itself survives to dominate the field.

Eighty percent of all alfalfa grown in Canada is currently grown in the west. A trial was set to release the biotech crop in far eastern Canada as a way to limit cross-pollination, but FGI has halted the entire alfalfa endeavor in Canada. Why the sudden change? Perhaps FGI was smart enough to listen to farmers saying they didn’t want GM alfalfa, but more likely, groups like Quebec’s general farm organization, the Union des producteurs agricoles (UPA), and other activists are keeping biotech on its toes. The UPA passed a motion last February to block the marketing of GM alfalfa in Quebec to prevent potential cross-pollination.

“It’s the right decision to keep GM alfalfa off the market this spring and every spring in the future,” said Canadian Biotechnology Action Network coordinator Lucy Sharratt. “It’s great if Forage Genetics is actually listening to farmers on this issue. The government certainly didn’t. . . We’ve heard a lot of opposition from Ontario and Quebec farmers to release of GM alfalfa, so it in fact looks like farmers across Canada are asking the company not to put this product on the market, and I hope that this is actually a response to that voice.”

Source: NaturalSociety


Minute Maid’s Pomegranate Blueberry Juice is 99% Apple and Grape Juice

Coca-Cola and Pom Wonderful are gearing up for a court battle, a lawsuit in which marketing and the public’s right to know exactly what’s in their food are front and center. At issue is a juice from Coca-Cola’s Minute Maid division and Pom Wonderful’s allegation that the juice’s label is completely misleading. In fact, it is very misleading.


If a juice is labeled as a Pomegranate Blueberry juice blend, with photos of pomegranate and blueberries figuring prominently in front of apples, grapes, and raspberries, one would gather that the blend contains at least as much pomegranate and blueberry juice as it does the other fruit juices. One would think. But as we know, food producers are quite crafty with their marketing and labeling, and they aren’t much concerned with how deceptive practices mislead conscientious consumers.

The label on Minute Maid’s Pomegranate Blueberry Flavored Blend of five juices is misleading. Pomegranates and blueberries are the only fruits mentioned in the title, and their photos are prominent on the label. But the juice is 99.4 percent apple and grape juice, containing only 0.3 percent pomegranate juice and 0.2 percent blueberry juice (the remaining 0.1 percent is raspberry).

Pom Wonderful first sued Coca-Cola in 2008 under the Lanham Act, a statute under which only companies, not consumers, can take action against other companies over false or misleading advertising. For Pom Wonderful, the fight is personal, since its pomegranate juice is 100 percent pomegranate.

Consumers purchasing the Minute Maid blend are likely to think they are getting a juice with a fair amount of pomegranate. But only if they are careful enough to read the back label will they realize pomegranate is far from the first ingredient.

Side note: If you aren’t reading the labels on the back of your foods, you are doing your health a disservice.

So far, the Food and Drug Administration has allowed labeling practices like those on the Minute Maid juice, letting food companies name beverages according to flavor rather than actual content. The US Supreme Court will now decide whether Pom has the right to sue Coca-Cola under the Lanham Act and how federal labeling laws affect the battle.

Bloomberg Businessweek says that if the Supreme Court sides with Pom on the issues, it could mean increasing legal scrutiny for the makers of processed foods and their deceptive labeling practices—something we should all be hoping for.

Source: NaturalSociety
