Sunday, September 1, 2013

Cancer-Causing Chemical Found in 98 Shampoos and Soaps

Tests ordered by an environmental watchdog group revealed the presence of a cancer-causing chemical in dozens of personal care products that lack a warning label required by California law.

The compound, a chemically modified form of coconut oil—cocamide diethanolamine (cocamide DEA)—is used as a foaming agent or thickener in soaps, shampoos, conditioners, and similar products.


No Warning Labels

An independent laboratory commissioned by the Center for Environmental Health (CEH) tested the products to determine how much cocamide DEA was present. CEH purchased these products after June 2013 from online and local California retailers, such as Trader Joe’s, Walmart, Kohl’s, and Babies R Us.

Many of the products tested contained more than 10,000 parts per million (ppm) of cocamide DEA. In all, CEH identified 98 products with cocamide DEA among the ingredients, none of which carried the warning required by state law.

"The state has not set a [safety] level specific to cocamide DEA," says Charles Margulis, Communications Director and Food Program Director of CEH, "but the levels we found exceed levels typical for carcinogens."


Under California’s Proposition 65, companies are required to provide a "clear and reasonable" warning to consumers when products they sell or produce contain chemicals listed by the state as harmful, including compounds known to cause cancer or birth defects.

Cocamide DEA was added to the California list of harmful chemicals in 2012 after the International Agency for Research on Cancer (IARC) published its review of the chemical’s safety, which was based upon skin exposure tests in animals. "There is sufficient evidence in experimental animals for the carcinogenicity of coconut oil diethanolamine condensate," the agency writes.

Environmental Group Files Lawsuit

In response to the laboratory results, the CEH filed a lawsuit on Tuesday against four companies—Walgreens, Lake Consumer Products, Ultimark Products, and Todd Christopher International.

"Our demand is that companies reformulate their products, without cocamide DEA,” says Margulis. “There are many similar shampoos and soaps on the market made without the chemical, so it is obviously possible to make the products safer."

The CEH also sent legal letters advising more than 100 other companies producing or selling products containing the chemical that their products violate Proposition 65.

In the lawsuit, which was filed in California Superior Court in Alameda County, the CEH accuses the companies of "knowingly and intentionally exposing individuals to cocamide DEA without first giving clear and reasonable warnings to such individuals regarding the carcinogenicity of cocamide DEA."


The lawsuit asks the court to fine the companies $2,500 a day for each violation and prevent them from selling products containing cocamide DEA in California without a clear warning label.

The CEH hopes these short-term actions, along with their continuing efforts, will have an even wider effect.

"Under the law, companies can simply label," says Margulis, "but we’ve had hundreds of Prop 65 cases over 17 years of doing this work, and in over 95 percent of these cases, we have won legally binding agreements that require companies to reformulate their products. We expect the same in these cases."

Source: Yahoo

Holistic Farming Hero Joel Salatin Gives Helluva Talk at TEDMED 2012

Insanity: new Google Glass app will read other people’s emotions



If you’ve ever studied infomercials, you know the whole business is based on back-end sales. It’s not the product you buy for $19.95, it’s the products they can hook you into after you spend the $19.95.

So it is with Google Glass. It’s all about the apps that’ll be attached.

Glass gives the wearer shorthand reality as he taps in. That’s what it’s for. The user is “on the go.” If he’s driving his Lexus and suddenly thinks about Plato, he’s not going to download the full text of The Republic to mull while he’s crashing into big trucks on the Jersey Turnpike. He’s going to take a shorthand summary. A few lines.

People want boiled-down info while they’re on the move. Reduction. The “essentials.”

This is perfectly in line with the codes of the culture. Ads, quick-hitter seminars, headlines, two-sentence summaries, ratings for products, news with no context. Stripped-down, reduced.

Well, here is a look into right now. A student at Stanford is developing a Google Glass app that “reads other people.”

From SFGate, 8/26, “Google Glass being designed to read emotions”: “The [emotion-recognition] tools can analyze facial expressions and vocal patterns for signs of specific emotions: Happiness, sadness, anger, frustration, and more.”

This is the work of Catalin Voss, an 18-year-old student at Stanford, and his start-up company, Sension.

So you’re wearing Google Glass at a meeting and it checks out the guy across the table who has an empty expression on his mug and, above your right eye, you see the word “neutral.” Now he smiles, and the word “happy” appears.
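
The article doesn’t describe Sension’s implementation, but the pipeline it gestures at (find a face, classify its expression, float a one-word label in front of the wearer) is simple to sketch. Below is a minimal, hypothetical stand-in in Python: a laptop webcam plays the part of Glass’s camera, OpenCV’s stock Haar cascades do the face-finding, and a crude “smile detected means happy, otherwise neutral” rule stands in for whatever model Sension actually uses. None of these details come from the article.

# Hypothetical sketch of the "label the face across the table" idea.
# A webcam and OpenCV's bundled Haar cascades stand in for Glass and
# Sension's model; the happy/neutral rule below is a placeholder, not
# anything described in the article.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def label_emotion(face_gray):
    # Crude placeholder: "happy" if a smile is detected inside the
    # face region, otherwise "neutral".
    smiles = smile_cascade.detectMultiScale(face_gray, scaleFactor=1.7,
                                            minNeighbors=20)
    return "happy" if len(smiles) > 0 else "neutral"

cap = cv2.VideoCapture(0)  # webcam standing in for the Glass camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        label = label_emotion(gray[y:y + h, x:x + w])
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # The word drawn above the box is the "neutral"/"happy" readout
        # the wearer would see above his right eye.
        cv2.putText(frame, label, (x, y - 10),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("emotion label (sketch)", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()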

I kid you not. This information is supposed to guide you in your communication. The number of things that can go wrong? Count the ways, if you’re able. I’m personally looking forward to that guy across the table saying, “Hey, you, schmuck with the Glass, what is your app saying about me now? Angry?” That should certainly enhance the communication.

Or a husband, just back from his 12-mile morning bike ride, enters his Palo Alto home, wearing Glass, of course, and as he looks at his wife, who is sitting at the kitchen table reading a book, he sees the word “sad” appear above his eye. “Honey,” he says, recalling the skills he picked up in a 26-minute webinar, “have you been pursuing a negative line of thinking?”

She slowly gazes up at the goggle-eyed monster in his spandex and grasshopper helmet, rises from her chair and tosses a plate of hot eggs in his face. YouTube, please!

But wait. There’s more. The Glass app is also being heralded as a step forward in “machine-human relationships.” With recognition services like Google Now and Siri, when computers and human users talk to each other, the computers will be able to respond not only to the content of the user’s words, but also to his tone, his feelings.

This should be a real marvel. As you’ve no doubt already realized, the emotion-recognition tool is all about reduction. It shrinks human feelings to simplistic labels. Therefore, what machines say back to humans will be something to behold.

Machine version of NLP, anyone? I’m predicting a surge in destroyed computers.

The astonishing thing about this new app is that many tech people are so on-board with it. In other words, they believe that human feelings can be broken down and worked with on an androidal basis, with no loss incurred. These people are already boiled down, cartoonized.

You think you’ve observed predictive programming in movies? That’s nothing. The use of apps like this one will help bring about a greater willingness on the part of humans to reduce their own thoughts and feelings to…FIT THE SPECS OF THE MACHINES AND THE SOFTWARE.

Count on it.

This isn’t really about machines acting more like humans. It’s about humans acting like machines.

The potential range of human emotions is extraordinary. Our language, when used with imagination, actually extends that range. It’s something called art.

The counter-trend is in gear. No matter how subtle the emotion-recognition algorithms become, there will always be a wide, wide gap between what they produce and the expression of humans.

The most profound kind of mind control seeks to eliminate that gap by encouraging us to mimic technology. That means people will think and feel less, and what they think and feel will mean less.

The machines won’t say, “I’m sorry, I can’t identify that emotion, it’s too complex.” They’ll say “sad” or “happy” or “upset” or whatever they have to say to give the appearance that they’re on top of the human condition.

Eventually, significant numbers of people will tailor their self-awareness to what the machines point to, name, label, declare.

Thus, inventing reality.

The wolf becomes a lamb, the lamb becomes a flea.

And peace prevails. You can wear it and see with it.

Eventually, realizing that Glass is too obvious and obnoxious and bulky, companies will develop something they might call Third Eye, a chip the size of half a grain of rice, made flat, and inserted under the skin of the forehead.

Perfect. Invisible. Of course, cops will have them. And talk to them.

“I’m parked at the corner of Wilshire and Westwood. Suspicious male standing outside the Harmon Building.”

“I see him. Searching relevant data.”

Which means any past arrests, race, conditions noted in his medical records, tax status, questionable statements he’s made in public or private, significant known associates, group affiliations, etc. And present state of mind.

The cop: “Recommendation?”

“Passive-aggressive, right now he’s peaking at 3.2 on the Hoover Bipolar scale. Bring subject into custody for general questioning.”

“Will do.”

No one will wonder why, because such analysis resonates with the vastly reduced general perception of what reality is all about.

People mimic how machines see them and adjust their human thinking accordingly.

Hand and glove, key and lock. Wonderful.

As the cop is transporting the suspect to the station, Third Eye intercedes:

"Sorry, Officer Crane, it took me a minute to dig further. Suspect is business associate of REDACTED. This is a catch and release. Repeat, catch and release. Printing out four backstage passes to Third Memorial Rolling Stones concert at the Hollywood Bowl. Apologize profusely, give subject the tickets, and release him immediately.”

“I copy.”

“This arrest and attendant communication is being deleted…now.”

Source: Activist Post

The worst is now mainstream from Fuku; the leaks now are equal to the event